Apr 16 19:53:41.112857 ip-10-0-135-244 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 19:53:41.529975 ip-10-0-135-244 kubenswrapper[2563]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:41.529975 ip-10-0-135-244 kubenswrapper[2563]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 19:53:41.529975 ip-10-0-135-244 kubenswrapper[2563]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:41.529975 ip-10-0-135-244 kubenswrapper[2563]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 19:53:41.529975 ip-10-0-135-244 kubenswrapper[2563]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:41.531523 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.531441 2563 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 19:53:41.535393 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535378 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:41.535435 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535394 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
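The deprecation warnings above all point the same way: these command-line flags should move into the kubelet configuration file passed via --config (here /etc/kubernetes/kubelet.conf). A minimal KubeletConfiguration sketch showing where such settings live — the field names are from the kubelet.config.k8s.io/v1beta1 API, but the values below are illustrative, not taken from this node:

```yaml
kind: KubeletConfiguration
apiVersion: kubelet.config.k8s.io/v1beta1
# replaces --container-runtime-endpoint (value here assumes CRI-O)
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir (path is an example)
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved (example reservations)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# replaces --minimum-container-ttl-duration per the warning's advice
evictionHard:
  memory.available: "100Mi"
```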
Apr 16 19:53:41.535435 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535400 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:41.535435 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535403 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:41.535435 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535407 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:41.535435 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535410 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:41.535435 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535413 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:41.535435 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535416 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:41.535435 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535419 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:41.535435 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535422 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:41.535435 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535425 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:41.535435 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535428 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:41.535435 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535431 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:41.535435 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535433 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:41.535435 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535436 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:41.535435 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535439 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:41.535435 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535442 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:41.535842 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535445 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:41.535842 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535453 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:41.535842 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535456 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:41.535842 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535459 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:41.535842 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535461 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:41.535842 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535464 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:41.535842 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535466 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:41.535842 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535469 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:41.535842 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535471 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:41.535842 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535474 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:41.535842 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535476 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:41.535842 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535479 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:41.535842 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535482 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:41.535842 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535485 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:41.535842 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535487 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:41.535842 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535490 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:41.535842 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535492 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:41.535842 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535495 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:41.535842 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535497 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:41.535842 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535500 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:41.536328 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535502 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:41.536328 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535506 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:41.536328 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535508 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:41.536328 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535511 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:41.536328 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535513 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:41.536328 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535516 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:41.536328 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535518 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:41.536328 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535521 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:41.536328 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535524 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:41.536328 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535526 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:41.536328 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535529 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:41.536328 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535531 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:41.536328 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535534 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:41.536328 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535537 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:41.536328 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535540 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:41.536328 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535543 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:41.536328 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535546 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:41.536328 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535548 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:41.536328 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535551 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:41.536813 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535553 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:41.536813 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535556 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:41.536813 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535560 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:41.536813 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535564 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:41.536813 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535567 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:41.536813 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535570 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:41.536813 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535572 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:41.536813 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535575 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:41.536813 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535578 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:41.536813 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535580 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:41.536813 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535583 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:41.536813 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535585 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:41.536813 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535588 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:41.536813 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535591 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:41.536813 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535613 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:41.536813 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535618 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:41.536813 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535623 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:41.536813 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535627 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:41.536813 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535631 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:41.537270 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535635 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:41.537270 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535638 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:41.537270 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535640 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:41.537270 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535643 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:41.537270 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535646 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:41.537270 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535649 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:41.537270 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535652 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:41.537270 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535655 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:41.537270 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535658 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:41.537270 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535661 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:41.537270 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.535663 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:41.537270 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536029 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:41.537270 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536034 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:41.537270 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536037 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:41.537270 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536040 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:41.537270 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536042 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:41.537270 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536045 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:41.537270 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536048 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:41.537270 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536051 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:41.537270 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536053 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:41.537814 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536056 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:41.537814 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536058 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:41.537814 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536061 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:41.537814 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536063 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:41.537814 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536066 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:41.537814 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536069 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:41.537814 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536071 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:41.537814 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536074 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:41.537814 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536076 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:41.537814 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536079 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:41.537814 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536081 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:41.537814 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536083 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:41.537814 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536087 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:41.537814 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536089 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:41.537814 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536092 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:41.537814 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536095 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:41.537814 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536097 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:41.537814 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536100 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:41.537814 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536103 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:41.538281 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536108 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:41.538281 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536112 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:41.538281 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536114 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:41.538281 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536117 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:41.538281 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536120 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:41.538281 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536123 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:41.538281 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536126 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:41.538281 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536130 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:41.538281 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536134 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:41.538281 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536137 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:41.538281 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536140 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:41.538281 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536143 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:41.538281 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536145 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:41.538281 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536148 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:41.538281 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536151 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:41.538281 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536153 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:41.538281 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536156 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:41.538281 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536159 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:41.538281 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536162 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:41.538764 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536164 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:41.538764 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536167 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:41.538764 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536169 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:41.538764 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536172 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:41.538764 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536174 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:41.538764 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536177 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:41.538764 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536179 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:41.538764 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536182 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:41.538764 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536185 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:41.538764 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536188 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:41.538764 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536190 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:41.538764 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536194 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:41.538764 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536197 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:41.538764 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536200 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:41.538764 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536202 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:41.538764 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536205 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:41.538764 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536207 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:41.538764 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536210 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:41.538764 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536212 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536215 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536217 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536220 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536222 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536224 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536227 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536229 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536232 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536234 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536237 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536240 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536242 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536244 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536247 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536249 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536252 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536254 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536257 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536259 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536261 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:41.539236 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536340 2563 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536348 2563 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536355 2563 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536359 2563 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536364 2563 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536367 2563 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536372 2563 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536376 2563 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536380 2563 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536383 2563 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536386 2563 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536390 2563 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536393 2563 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536396 2563 flags.go:64] FLAG: --cgroup-root=""
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536398 2563 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536401 2563 flags.go:64] FLAG: --client-ca-file=""
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536404 2563 flags.go:64] FLAG: --cloud-config=""
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536407 2563 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536410 2563 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536414 2563 flags.go:64] FLAG: --cluster-domain=""
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536417 2563 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536420 2563 flags.go:64] FLAG: --config-dir=""
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536423 2563 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536426 2563 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 19:53:41.539763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536430 2563 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536433 2563 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536436 2563 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536439 2563 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536442 2563 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536444 2563 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536447 2563 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536450 2563 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536453 2563 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536457 2563 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536461 2563 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536464 2563 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536467 2563 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536470 2563 flags.go:64] FLAG: --enable-server="true"
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536472 2563 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536477 2563 flags.go:64] FLAG: --event-burst="100"
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536480 2563 flags.go:64] FLAG: --event-qps="50"
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536483 2563 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536486 2563 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536489 2563 flags.go:64] FLAG: --eviction-hard=""
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536492 2563 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536495 2563 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536498 2563 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536501 2563 flags.go:64] FLAG: --eviction-soft=""
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536504 2563 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 19:53:41.540339 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536507 2563 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536510 2563 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536513 2563 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536516 2563 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536519 2563 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536522 2563 flags.go:64] FLAG: --feature-gates=""
Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536526 2563 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536529 2563 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536532 2563
flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536535 2563 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536538 2563 flags.go:64] FLAG: --healthz-port="10248" Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536541 2563 flags.go:64] FLAG: --help="false" Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536544 2563 flags.go:64] FLAG: --hostname-override="ip-10-0-135-244.ec2.internal" Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536548 2563 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536551 2563 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536553 2563 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536557 2563 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536560 2563 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536564 2563 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536567 2563 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536570 2563 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536572 2563 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 19:53:41.540959 ip-10-0-135-244 
kubenswrapper[2563]: I0416 19:53:41.536575 2563 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536578 2563 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 19:53:41.540959 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536581 2563 flags.go:64] FLAG: --kube-reserved="" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536584 2563 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536587 2563 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536590 2563 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536592 2563 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536595 2563 flags.go:64] FLAG: --lock-file="" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536612 2563 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536615 2563 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536618 2563 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536623 2563 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536626 2563 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536629 2563 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536632 2563 flags.go:64] FLAG: 
--logging-format="text" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536636 2563 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536639 2563 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536642 2563 flags.go:64] FLAG: --manifest-url="" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536645 2563 flags.go:64] FLAG: --manifest-url-header="" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536649 2563 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536652 2563 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536656 2563 flags.go:64] FLAG: --max-pods="110" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536660 2563 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536663 2563 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536666 2563 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536669 2563 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536672 2563 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 19:53:41.541548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536675 2563 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536678 2563 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536686 2563 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536689 2563 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536692 2563 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536696 2563 flags.go:64] FLAG: --pod-cidr="" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536699 2563 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536704 2563 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536707 2563 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536710 2563 flags.go:64] FLAG: --pods-per-core="0" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536713 2563 flags.go:64] FLAG: --port="10250" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536716 2563 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536719 2563 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-032b7c662515760d7" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536722 2563 flags.go:64] FLAG: --qos-reserved="" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536725 2563 flags.go:64] FLAG: --read-only-port="10255" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536728 
2563 flags.go:64] FLAG: --register-node="true" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536731 2563 flags.go:64] FLAG: --register-schedulable="true" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536733 2563 flags.go:64] FLAG: --register-with-taints="" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536737 2563 flags.go:64] FLAG: --registry-burst="10" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536740 2563 flags.go:64] FLAG: --registry-qps="5" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536743 2563 flags.go:64] FLAG: --reserved-cpus="" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536746 2563 flags.go:64] FLAG: --reserved-memory="" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536750 2563 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536753 2563 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536756 2563 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 19:53:41.542249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536758 2563 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536761 2563 flags.go:64] FLAG: --runonce="false" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536764 2563 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536767 2563 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536770 2563 flags.go:64] FLAG: --seccomp-default="false" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 
19:53:41.536773 2563 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536776 2563 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536779 2563 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536782 2563 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536785 2563 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536789 2563 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536791 2563 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536794 2563 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536797 2563 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536800 2563 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536803 2563 flags.go:64] FLAG: --system-cgroups="" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536806 2563 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536811 2563 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536816 2563 flags.go:64] FLAG: --tls-cert-file="" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536819 2563 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536823 2563 flags.go:64] FLAG: --tls-min-version="" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536826 2563 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536829 2563 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536832 2563 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536835 2563 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 19:53:41.542901 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536841 2563 flags.go:64] FLAG: --v="2" Apr 16 19:53:41.543502 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536846 2563 flags.go:64] FLAG: --version="false" Apr 16 19:53:41.543502 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536850 2563 flags.go:64] FLAG: --vmodule="" Apr 16 19:53:41.543502 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536854 2563 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 19:53:41.543502 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.536857 2563 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 19:53:41.543502 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536943 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:53:41.543502 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536946 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:53:41.543502 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536949 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 19:53:41.543502 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536953 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas 
Apr 16 19:53:41.543502 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536955 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:53:41.543502 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536958 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:53:41.543502 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536960 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:53:41.543502 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536963 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:53:41.543502 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536966 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:53:41.543502 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536969 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 19:53:41.543502 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536971 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:53:41.543502 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536974 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 19:53:41.543502 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536976 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:53:41.543502 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536980 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 19:53:41.543502 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536982 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:53:41.543502 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536985 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:53:41.544010 ip-10-0-135-244 kubenswrapper[2563]: 
W0416 19:53:41.536987 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:53:41.544010 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536990 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:53:41.544010 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536992 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:53:41.544010 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536995 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:53:41.544010 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.536999 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:53:41.544010 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537001 2563 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:53:41.544010 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537004 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:53:41.544010 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537006 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:53:41.544010 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537009 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:53:41.544010 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537011 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:53:41.544010 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537014 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:53:41.544010 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537018 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 19:53:41.544010 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537020 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 
19:53:41.544010 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537023 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:53:41.544010 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537026 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:53:41.544010 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537028 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:53:41.544010 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537032 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 19:53:41.544010 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537036 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:53:41.544010 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537039 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:53:41.544010 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537042 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:53:41.544530 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537045 2563 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:53:41.544530 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537048 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 19:53:41.544530 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537051 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:53:41.544530 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537054 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 19:53:41.544530 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537056 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:53:41.544530 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537059 
2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:53:41.544530 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537062 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:53:41.544530 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537064 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:53:41.544530 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537067 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:53:41.544530 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537069 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:53:41.544530 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537073 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:53:41.544530 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537076 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:53:41.544530 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537079 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:53:41.544530 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537081 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:53:41.544530 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537084 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:53:41.544530 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537086 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:53:41.544530 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537091 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:53:41.544530 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537094 2563 feature_gate.go:328] unrecognized feature gate: 
VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:53:41.544530 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537096 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:53:41.545005 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537099 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:53:41.545005 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537101 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:53:41.545005 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537104 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:53:41.545005 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537106 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:53:41.545005 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537110 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:53:41.545005 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537112 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:53:41.545005 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537115 2563 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:53:41.545005 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537117 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:53:41.545005 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537120 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:53:41.545005 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537122 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:53:41.545005 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537125 2563 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:53:41.545005 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537128 2563 
feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:53:41.545005 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537130 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:53:41.545005 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537133 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:53:41.545005 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537135 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:53:41.545005 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537138 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:53:41.545005 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537140 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:53:41.545005 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537143 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:53:41.545005 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537145 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:53:41.545005 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537148 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:53:41.545527 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537150 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:53:41.545527 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537153 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:53:41.545527 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537155 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:53:41.545527 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537158 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:53:41.545527 
ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537161 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:41.545527 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537163 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:41.545527 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537167 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:41.545527 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537170 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:41.545527 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537173 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:41.545527 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537177 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:41.545527 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.537179 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:41.545527 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.537924 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:53:41.545527 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.545023 2563 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 19:53:41.545527 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.545038 2563 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 19:53:41.545527 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545083 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:41.545925 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545087 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:41.545925 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545091 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:41.545925 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545094 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:41.545925 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545097 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:41.545925 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545100 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:41.545925 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545103 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:41.545925 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545106 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:41.545925 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545108 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:41.545925 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545111 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:41.545925 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545113 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:41.545925 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545116 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:41.545925 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545118 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:41.545925 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545121 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:41.545925 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545123 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:41.545925 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545126 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:41.545925 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545129 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:41.545925 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545131 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:41.545925 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545134 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:41.545925 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545136 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:41.545925 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545139 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:41.546451 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545141 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:41.546451 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545145 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:41.546451 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545148 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:41.546451 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545151 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:41.546451 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545153 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:41.546451 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545156 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:41.546451 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545158 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:41.546451 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545161 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:41.546451 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545163 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:41.546451 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545166 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:41.546451 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545169 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:41.546451 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545172 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:41.546451 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545175 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:41.546451 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545179 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:41.546451 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545183 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:41.546451 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545186 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:41.546451 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545189 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:41.546451 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545192 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:41.546451 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545196 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:41.546940 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545200 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:41.546940 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545203 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:41.546940 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545206 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:41.546940 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545209 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:41.546940 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545212 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:41.546940 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545214 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:41.546940 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545217 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:41.546940 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545219 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:41.546940 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545221 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:41.546940 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545224 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:41.546940 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545227 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:41.546940 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545230 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:41.546940 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545232 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:41.546940 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545235 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:41.546940 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545237 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:41.546940 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545240 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:41.546940 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545243 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:41.546940 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545245 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:41.546940 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545248 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:41.546940 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545250 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:41.547420 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545253 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:41.547420 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545256 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:41.547420 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545258 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:41.547420 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545261 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:41.547420 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545264 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:41.547420 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545266 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:41.547420 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545269 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:41.547420 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545271 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:41.547420 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545274 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:41.547420 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545276 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:41.547420 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545279 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:41.547420 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545281 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:41.547420 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545284 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:41.547420 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545286 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:41.547420 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545289 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:41.547420 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545292 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:41.547420 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545294 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:41.547420 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545297 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:41.547420 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545299 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:41.547420 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545302 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:41.547926 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545304 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:41.547926 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545307 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:41.547926 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545309 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:41.547926 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545312 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:41.547926 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545314 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:41.547926 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545317 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:41.547926 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.545322 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:53:41.547926 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545412 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:41.547926 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545417 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:41.547926 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545420 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:41.547926 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545422 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:41.547926 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545425 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:41.547926 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545428 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:41.547926 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545431 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:41.547926 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545433 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:41.547926 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545436 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:41.548324 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545439 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:41.548324 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545442 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:41.548324 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545444 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:41.548324 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545447 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:41.548324 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545458 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:41.548324 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545461 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:41.548324 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545464 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:41.548324 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545467 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:41.548324 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545469 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:41.548324 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545472 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:41.548324 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545475 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:41.548324 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545477 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:41.548324 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545480 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:41.548324 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545482 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:41.548324 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545485 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:41.548324 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545487 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:41.548324 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545490 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:41.548324 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545493 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:41.548324 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545495 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:41.548324 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545498 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:41.548827 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545500 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:41.548827 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545503 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:41.548827 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545505 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:41.548827 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545508 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:41.548827 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545510 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:41.548827 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545514 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:41.548827 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545517 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:41.548827 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545519 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:41.548827 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545522 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:41.548827 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545524 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:41.548827 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545526 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:41.548827 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545529 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:41.548827 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545532 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:41.548827 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545535 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:41.548827 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545538 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:41.548827 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545540 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:41.548827 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545543 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:41.548827 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545545 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:41.548827 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545548 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:41.548827 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545550 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:41.549297 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545553 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:41.549297 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545555 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:41.549297 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545559 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:41.549297 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545563 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:41.549297 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545566 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:41.549297 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545568 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:41.549297 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545571 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:41.549297 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545574 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:41.549297 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545577 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:41.549297 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545579 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:41.549297 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545582 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:41.549297 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545586 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:41.549297 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545588 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:41.549297 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545591 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:41.549297 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545593 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:41.549297 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545596 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:41.549297 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545610 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:41.549297 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545613 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:41.549297 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545616 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:41.549775 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545619 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:41.549775 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545621 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:41.549775 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545624 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:41.549775 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545627 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:41.549775 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545630 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:41.549775 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545632 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:41.549775 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545635 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:41.549775 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545638 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:41.549775 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545641 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:41.549775 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545643 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:41.549775 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545646 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:41.549775 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545649 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:41.549775 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545652 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:41.549775 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545654 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:41.549775 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545657 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:41.549775 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545659 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:41.549775 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545662 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:41.549775 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:41.545664 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:41.550227 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.545669 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:53:41.550227 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.546459 2563 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 19:53:41.550227 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.549475 2563 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 19:53:41.550456 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.550444 2563 server.go:1019] "Starting client certificate rotation"
Apr 16 19:53:41.550565 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.550547 2563 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:53:41.550597 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.550585 2563 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:53:41.572545 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.572528 2563 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:53:41.574891 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.574873 2563 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:53:41.588042 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.588025 2563 log.go:25] "Validated CRI v1 runtime API"
Apr 16 19:53:41.593836 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.593821 2563 log.go:25] "Validated CRI v1 image API"
Apr 16 19:53:41.595091 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.595076 2563 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 19:53:41.598947 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.598928 2563 fs.go:135] Filesystem UUIDs: map[53602253-f971-4961-83f2-b6d7898b5071:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 d4b8fa90-73f5-45f9-8ac5-e260b05380e8:/dev/nvme0n1p3]
Apr 16 19:53:41.599007 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.598945 2563 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 19:53:41.603865 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.603849 2563 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:53:41.604138 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.604034 2563 manager.go:217] Machine: {Timestamp:2026-04-16 19:53:41.602893261 +0000 UTC m=+0.379470250 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099775 MemoryCapacity:33164500992 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec21a4c1965d50a522c6fb497be7de74 SystemUUID:ec21a4c1-965d-50a5-22c6-fb497be7de74 BootID:e9304490-c19f-46bb-864f-572cc1beb052 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582250496 Type:vfs Inodes:4048401 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:19:a9:14:89:0d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:19:a9:14:89:0d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1e:f4:56:f4:05:c2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164500992 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 19:53:41.604138 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.604135 2563 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 19:53:41.604246 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.604235 2563 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 19:53:41.605268 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.605244 2563 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 19:53:41.605398 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.605271 2563 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-244.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 19:53:41.605440 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.605407 2563 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 19:53:41.605440 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.605416 2563 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 19:53:41.605440 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.605433 2563 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 19:53:41.606876 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.606865 2563 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 19:53:41.608563 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.608553 2563 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 19:53:41.608808 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.608798 2563 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 19:53:41.610802 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.610792 2563 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 19:53:41.610833 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.610806 2563 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 19:53:41.610833 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.610823 2563 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 19:53:41.610833 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.610833 2563 kubelet.go:397] "Adding apiserver pod source"
Apr 16 19:53:41.610921 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.610842 2563 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 19:53:41.611889 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.611877 2563 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 19:53:41.611931 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.611895 2563 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 19:53:41.614933 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.614918 2563 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 19:53:41.616545 ip-10-0-135-244
kubenswrapper[2563]: I0416 19:53:41.616532 2563 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 19:53:41.618176 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.618164 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 19:53:41.618230 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.618184 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 19:53:41.618230 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.618217 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 19:53:41.618230 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.618224 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 19:53:41.618230 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.618230 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 19:53:41.618342 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.618236 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 19:53:41.618342 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.618242 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 19:53:41.618342 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.618247 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 19:53:41.618342 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.618254 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 19:53:41.618342 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.618260 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 19:53:41.618342 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.618269 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 
19:53:41.618342 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.618277 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 19:53:41.619117 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.619108 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 19:53:41.619117 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.619117 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 19:53:41.622627 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.622613 2563 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 19:53:41.622734 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.622648 2563 server.go:1295] "Started kubelet" Apr 16 19:53:41.622789 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.622729 2563 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 19:53:41.622850 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.622774 2563 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 19:53:41.622850 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.622835 2563 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 19:53:41.623462 ip-10-0-135-244 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 19:53:41.623847 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:41.623825 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 19:53:41.623935 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:41.623882 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-244.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 19:53:41.624118 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.624006 2563 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-244.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 19:53:41.625195 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.625183 2563 server.go:317] "Adding debug handlers to kubelet server" Apr 16 19:53:41.626729 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.626706 2563 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 19:53:41.631479 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:41.630724 2563 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-244.ec2.internal.18a6ee653cc0ed19 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-244.ec2.internal,UID:ip-10-0-135-244.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-244.ec2.internal,},FirstTimestamp:2026-04-16 19:53:41.622623513 +0000 UTC m=+0.399200503,LastTimestamp:2026-04-16 19:53:41.622623513 +0000 UTC m=+0.399200503,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-244.ec2.internal,}" Apr 16 19:53:41.634181 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.634154 2563 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6x4vl" Apr 16 19:53:41.634579 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.634559 2563 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 19:53:41.634683 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:41.634645 2563 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 19:53:41.635124 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.635106 2563 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 19:53:41.635828 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.635808 2563 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 19:53:41.635916 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.635831 2563 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 19:53:41.635916 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.635806 2563 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 19:53:41.635916 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.635901 2563 reconstruct.go:97] "Volume reconstruction finished" Apr 16 19:53:41.636028 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.635927 2563 reconciler.go:26] "Reconciler: start to sync state" Apr 16 19:53:41.636091 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:41.636074 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-244.ec2.internal\" not found" Apr 16 19:53:41.636458 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:41.636417 2563 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-244.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 19:53:41.636645 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:41.636626 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.CSIDriver" Apr 16 19:53:41.636763 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.636750 2563 factory.go:55] Registering systemd factory Apr 16 19:53:41.636812 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.636774 2563 factory.go:223] Registration of the systemd container factory successfully Apr 16 19:53:41.637026 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.637010 2563 factory.go:153] Registering CRI-O factory Apr 16 19:53:41.637081 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.637030 2563 factory.go:223] Registration of the crio container factory successfully Apr 16 19:53:41.637081 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.637076 2563 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 19:53:41.637191 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.637093 2563 factory.go:103] Registering Raw factory Apr 16 19:53:41.637191 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.637107 2563 manager.go:1196] Started watching for new ooms in manager Apr 16 19:53:41.638108 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.638094 2563 manager.go:319] Starting recovery of all containers Apr 16 19:53:41.640725 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.640705 2563 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6x4vl" Apr 16 19:53:41.647347 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.647331 2563 manager.go:324] Recovery completed Apr 16 19:53:41.651217 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.651205 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:41.655328 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.655313 2563 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-135-244.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:41.655403 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.655340 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:41.655403 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.655350 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-244.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:41.655832 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.655814 2563 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 19:53:41.655832 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.655833 2563 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 19:53:41.655958 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.655849 2563 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:53:41.658684 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.658665 2563 policy_none.go:49] "None policy: Start" Apr 16 19:53:41.658684 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.658681 2563 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 19:53:41.658756 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.658690 2563 state_mem.go:35] "Initializing new in-memory state store" Apr 16 19:53:41.697187 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.697147 2563 manager.go:341] "Starting Device Plugin manager" Apr 16 19:53:41.701620 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:41.697215 2563 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 19:53:41.701620 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.697225 2563 server.go:85] "Starting device plugin registration server" Apr 16 19:53:41.701620 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.697411 2563 eviction_manager.go:189] "Eviction manager: starting control 
loop" Apr 16 19:53:41.701620 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.697420 2563 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 19:53:41.701620 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.697521 2563 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 19:53:41.701620 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.697619 2563 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 19:53:41.701620 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.697632 2563 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 19:53:41.701620 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:41.698066 2563 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 19:53:41.701620 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:41.698104 2563 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-244.ec2.internal\" not found" Apr 16 19:53:41.792617 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.792526 2563 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 19:53:41.793665 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.793637 2563 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 19:53:41.793748 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.793673 2563 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 19:53:41.793748 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.793696 2563 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 19:53:41.793748 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.793706 2563 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 19:53:41.793748 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:41.793745 2563 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 19:53:41.797290 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.797271 2563 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:41.798310 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.798289 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:41.801731 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.801718 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-244.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:41.801795 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.801741 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:41.801795 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.801751 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-244.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:41.801795 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.801772 2563 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-244.ec2.internal" Apr 16 19:53:41.809905 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.809891 2563 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-244.ec2.internal" Apr 16 19:53:41.809950 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:41.809911 2563 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-244.ec2.internal\": node \"ip-10-0-135-244.ec2.internal\" not found" Apr 16 
19:53:41.823439 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:41.823420 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-244.ec2.internal\" not found" Apr 16 19:53:41.894003 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.893984 2563 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-244.ec2.internal"] Apr 16 19:53:41.894067 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.894041 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:41.895591 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.895576 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-244.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:41.895673 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.895613 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:41.895673 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.895623 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-244.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:41.896701 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.896689 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:41.896820 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.896807 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal" Apr 16 19:53:41.896862 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.896832 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:41.897361 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.897337 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-244.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:41.897361 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.897350 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-244.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:41.897513 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.897366 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:41.897513 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.897375 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:41.897513 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.897381 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-244.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:41.897513 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.897385 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-244.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:41.898510 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.898491 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-244.ec2.internal" Apr 16 19:53:41.898585 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.898520 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:41.899183 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.899166 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-244.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:41.899277 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.899191 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:41.899277 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.899203 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-244.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:41.923654 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:41.923633 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-244.ec2.internal\" not found" Apr 16 19:53:41.934134 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:41.934111 2563 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-244.ec2.internal\" not found" node="ip-10-0-135-244.ec2.internal" Apr 16 19:53:41.937150 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.937134 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/00ed7f10126a5d8aca706b75de740849-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal\" (UID: \"00ed7f10126a5d8aca706b75de740849\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal" Apr 16 19:53:41.937218 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.937157 2563 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c2fe7a6e5acf4d9e84afac6e0df862e1-config\") pod \"kube-apiserver-proxy-ip-10-0-135-244.ec2.internal\" (UID: \"c2fe7a6e5acf4d9e84afac6e0df862e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-244.ec2.internal" Apr 16 19:53:41.937218 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:41.937179 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/00ed7f10126a5d8aca706b75de740849-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal\" (UID: \"00ed7f10126a5d8aca706b75de740849\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal" Apr 16 19:53:41.938430 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:41.938412 2563 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-244.ec2.internal\" not found" node="ip-10-0-135-244.ec2.internal" Apr 16 19:53:42.023790 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:42.023771 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-244.ec2.internal\" not found" Apr 16 19:53:42.037320 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.037302 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/00ed7f10126a5d8aca706b75de740849-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal\" (UID: \"00ed7f10126a5d8aca706b75de740849\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal" Apr 16 19:53:42.037368 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.037326 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/00ed7f10126a5d8aca706b75de740849-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal\" (UID: \"00ed7f10126a5d8aca706b75de740849\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal" Apr 16 19:53:42.037368 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.037344 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c2fe7a6e5acf4d9e84afac6e0df862e1-config\") pod \"kube-apiserver-proxy-ip-10-0-135-244.ec2.internal\" (UID: \"c2fe7a6e5acf4d9e84afac6e0df862e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-244.ec2.internal" Apr 16 19:53:42.037431 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.037377 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c2fe7a6e5acf4d9e84afac6e0df862e1-config\") pod \"kube-apiserver-proxy-ip-10-0-135-244.ec2.internal\" (UID: \"c2fe7a6e5acf4d9e84afac6e0df862e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-244.ec2.internal" Apr 16 19:53:42.037431 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.037402 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/00ed7f10126a5d8aca706b75de740849-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal\" (UID: \"00ed7f10126a5d8aca706b75de740849\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal" Apr 16 19:53:42.037431 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.037402 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/00ed7f10126a5d8aca706b75de740849-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal\" (UID: \"00ed7f10126a5d8aca706b75de740849\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal" Apr 16 19:53:42.124734 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:42.124684 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-244.ec2.internal\" not found" Apr 16 19:53:42.225217 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:42.225184 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-244.ec2.internal\" not found" Apr 16 19:53:42.236399 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.236379 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal" Apr 16 19:53:42.242307 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.242291 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-244.ec2.internal" Apr 16 19:53:42.325522 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:42.325496 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-244.ec2.internal\" not found" Apr 16 19:53:42.426101 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:42.426023 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-244.ec2.internal\" not found" Apr 16 19:53:42.450096 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.450077 2563 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:42.500169 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.500152 2563 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:42.536211 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.536190 2563 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal" Apr 16 19:53:42.548187 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.548169 2563 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 19:53:42.549665 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.549651 2563 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-244.ec2.internal" Apr 16 19:53:42.550589 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.550575 2563 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 19:53:42.550743 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:42.550719 2563 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://a1c00b575f7e541d3863001ca20ccbd8-d51b52652ada41e3.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/kube-system/pods\": read tcp 10.0.135.244:42542->32.192.140.192:6443: use of closed network connection" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-244.ec2.internal" Apr 16 19:53:42.550782 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.550737 2563 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 19:53:42.550782 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.550745 2563 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 19:53:42.550782 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.550762 2563 
reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 19:53:42.611890 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.611873 2563 apiserver.go:52] "Watching apiserver" Apr 16 19:53:42.631168 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.631136 2563 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 19:53:42.632190 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.632170 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-pxn88","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2","openshift-cluster-node-tuning-operator/tuned-tbhz7","openshift-image-registry/node-ca-nm7vd","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal","openshift-multus/multus-additional-cni-plugins-vqw9q","openshift-multus/multus-rjds9","openshift-multus/network-metrics-daemon-p54df","openshift-network-diagnostics/network-check-target-chnql","openshift-network-operator/iptables-alerter-sg2qr","openshift-ovn-kubernetes/ovnkube-node-8s7w4"] Apr 16 19:53:42.634554 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.634540 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pxn88" Apr 16 19:53:42.634689 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.634672 2563 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 19:53:42.637089 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.636962 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2" Apr 16 19:53:42.637089 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.637079 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" Apr 16 19:53:42.638077 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.637879 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-bcbns\"" Apr 16 19:53:42.638077 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.637883 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 19:53:42.638077 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.637937 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 19:53:42.638643 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.638624 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-nm7vd" Apr 16 19:53:42.639508 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.639486 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:53:42.639631 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.639536 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 19:53:42.639631 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.639548 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zj89f\"" Apr 16 19:53:42.639631 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.639620 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-sh69q\"" Apr 16 19:53:42.639844 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.639652 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 19:53:42.639844 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.639733 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 19:53:42.639938 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.639880 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 19:53:42.640199 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.640182 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vqw9q" Apr 16 19:53:42.640429 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.640397 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-sysctl-d\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" Apr 16 19:53:42.640550 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.640433 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-sysctl-conf\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" Apr 16 19:53:42.640550 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.640458 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-lib-modules\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" Apr 16 19:53:42.640550 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.640481 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-host\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" Apr 16 19:53:42.640550 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.640511 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjqbz\" (UniqueName: 
\"kubernetes.io/projected/940f6882-9538-4742-9cdc-585d4ceabae6-kube-api-access-bjqbz\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2" Apr 16 19:53:42.640550 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.640538 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-modprobe-d\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" Apr 16 19:53:42.640980 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.640578 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-sysconfig\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" Apr 16 19:53:42.640980 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.640626 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5488e199-2008-42c4-ab06-666d5ec0e2bf-konnectivity-ca\") pod \"konnectivity-agent-pxn88\" (UID: \"5488e199-2008-42c4-ab06-666d5ec0e2bf\") " pod="kube-system/konnectivity-agent-pxn88" Apr 16 19:53:42.640980 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.640649 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/940f6882-9538-4742-9cdc-585d4ceabae6-socket-dir\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2" Apr 16 19:53:42.640980 ip-10-0-135-244 
kubenswrapper[2563]: I0416 19:53:42.640667 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/940f6882-9538-4742-9cdc-585d4ceabae6-device-dir\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2" Apr 16 19:53:42.640980 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.640703 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/940f6882-9538-4742-9cdc-585d4ceabae6-sys-fs\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2" Apr 16 19:53:42.640980 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.640735 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/940f6882-9538-4742-9cdc-585d4ceabae6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2" Apr 16 19:53:42.640980 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.640782 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-systemd\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" Apr 16 19:53:42.640980 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.640807 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-var-lib-kubelet\") pod 
\"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" Apr 16 19:53:42.640980 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.640830 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-tmp\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" Apr 16 19:53:42.640980 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.640851 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 19:53:42.640980 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.640856 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4dx9c\"" Apr 16 19:53:42.640980 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.640857 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 19:53:42.640980 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.640857 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5488e199-2008-42c4-ab06-666d5ec0e2bf-agent-certs\") pod \"konnectivity-agent-pxn88\" (UID: \"5488e199-2008-42c4-ab06-666d5ec0e2bf\") " pod="kube-system/konnectivity-agent-pxn88" Apr 16 19:53:42.640980 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.640981 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/940f6882-9538-4742-9cdc-585d4ceabae6-registration-dir\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2" Apr 16 19:53:42.641551 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.641036 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-kubernetes\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" Apr 16 19:53:42.641551 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.641063 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-run\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" Apr 16 19:53:42.641551 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.641097 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-sys\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" Apr 16 19:53:42.641551 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.641110 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 19:53:42.641551 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.641120 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-tuned\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" Apr 16 19:53:42.641551 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.641143 
2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crwws\" (UniqueName: \"kubernetes.io/projected/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-kube-api-access-crwws\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" Apr 16 19:53:42.641551 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.641176 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/940f6882-9538-4742-9cdc-585d4ceabae6-etc-selinux\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2" Apr 16 19:53:42.641551 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.641442 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.642743 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.642723 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p54df" Apr 16 19:53:42.642832 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:42.642791 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p54df" podUID="81d750f7-8363-48b6-afd3-9847607883b7" Apr 16 19:53:42.643185 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.643166 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-75vmn\"" Apr 16 19:53:42.643282 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.643186 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 19:53:42.643282 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.643167 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 19:53:42.643282 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.643197 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 19:53:42.643282 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.643172 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 19:53:42.643282 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.643166 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 19:53:42.643558 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.643527 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 19:48:41 +0000 UTC" deadline="2028-01-20 12:48:42.227656372 +0000 UTC" Apr 16 19:53:42.643558 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.643556 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15448h54m59.584102468s" Apr 16 19:53:42.643807 ip-10-0-135-244 kubenswrapper[2563]: I0416 
19:53:42.643793 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 19:53:42.643855 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.643806 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-gjj4h\"" Apr 16 19:53:42.643952 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.643937 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chnql" Apr 16 19:53:42.644018 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:42.644000 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chnql" podUID="c7e55932-e28c-4952-86fc-0a2e235083be" Apr 16 19:53:42.645490 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.645474 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-sg2qr" Apr 16 19:53:42.645568 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.645507 2563 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 19:53:42.647065 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.647047 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.647803 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.647774 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 19:53:42.647890 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.647803 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 19:53:42.647890 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.647815 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-sr4qz\"" Apr 16 19:53:42.647890 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.647786 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:53:42.650414 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.650239 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 19:53:42.650414 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.650251 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 19:53:42.650414 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.650260 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 19:53:42.650414 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.650288 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 19:53:42.650414 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.650247 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 19:53:42.650723 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.650533 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 19:53:42.650723 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.650660 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vkgx9\"" Apr 16 19:53:42.668297 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.668276 2563 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-w77bx" Apr 16 19:53:42.679284 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.679230 2563 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-w77bx" Apr 16 19:53:42.715881 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:42.715854 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2fe7a6e5acf4d9e84afac6e0df862e1.slice/crio-f7a6249afadedb6194e133db41733c00985492860b4d12a561a66588489d63ce WatchSource:0}: Error finding container f7a6249afadedb6194e133db41733c00985492860b4d12a561a66588489d63ce: Status 404 returned error can't find the container with id f7a6249afadedb6194e133db41733c00985492860b4d12a561a66588489d63ce Apr 16 19:53:42.716128 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:42.716106 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00ed7f10126a5d8aca706b75de740849.slice/crio-f4833b7d6fa1e8f9af4eea7da7415d890304a1521d3733d7e662a751aa1ff81e WatchSource:0}: Error finding container f4833b7d6fa1e8f9af4eea7da7415d890304a1521d3733d7e662a751aa1ff81e: Status 404 returned error can't find the container with id 
f4833b7d6fa1e8f9af4eea7da7415d890304a1521d3733d7e662a751aa1ff81e Apr 16 19:53:42.720286 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.720252 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:53:42.734222 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.734204 2563 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:42.736920 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.736905 2563 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 19:53:42.741513 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.741495 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5488e199-2008-42c4-ab06-666d5ec0e2bf-agent-certs\") pod \"konnectivity-agent-pxn88\" (UID: \"5488e199-2008-42c4-ab06-666d5ec0e2bf\") " pod="kube-system/konnectivity-agent-pxn88" Apr 16 19:53:42.741586 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.741517 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-kubernetes\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" Apr 16 19:53:42.741586 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.741534 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-tuned\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" Apr 16 19:53:42.741586 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.741553 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.741586 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.741581 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/940f6882-9538-4742-9cdc-585d4ceabae6-etc-selinux\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2" Apr 16 19:53:42.741809 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.741624 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-kubernetes\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" Apr 16 19:53:42.741809 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.741649 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs\") pod \"network-metrics-daemon-p54df\" (UID: \"81d750f7-8363-48b6-afd3-9847607883b7\") " pod="openshift-multus/network-metrics-daemon-p54df" Apr 16 19:53:42.741809 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.741688 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-slash\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.741809 ip-10-0-135-244 
kubenswrapper[2563]: I0416 19:53:42.741659 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/940f6882-9538-4742-9cdc-585d4ceabae6-etc-selinux\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2" Apr 16 19:53:42.741809 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.741712 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-run-netns\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.741809 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.741797 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-etc-openvswitch\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.742079 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.741828 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5037aa30-5243-46c1-9238-71a0ee0cc436-ovn-node-metrics-cert\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.742079 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.741837 2563 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 19:53:42.742079 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.741878 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-multus-socket-dir-parent\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.742079 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.741904 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttpgz\" (UniqueName: \"kubernetes.io/projected/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-kube-api-access-ttpgz\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q" Apr 16 19:53:42.742079 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.741952 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b0c44e61-db3b-4f44-bfc4-d928140603e4-cni-binary-copy\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.742079 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.741981 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-host-var-lib-cni-multus\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.742079 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742010 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-systemd\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.742079 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742036 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-tmp\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.742079 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742061 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j6bm\" (UniqueName: \"kubernetes.io/projected/5037aa30-5243-46c1-9238-71a0ee0cc436-kube-api-access-5j6bm\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.742495 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742095 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-multus-conf-dir\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.742495 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742116 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-systemd\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.742495 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742124 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/940f6882-9538-4742-9cdc-585d4ceabae6-registration-dir\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2"
Apr 16 19:53:42.742495 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742175 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/940f6882-9538-4742-9cdc-585d4ceabae6-registration-dir\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2"
Apr 16 19:53:42.742495 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742188 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crwws\" (UniqueName: \"kubernetes.io/projected/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-kube-api-access-crwws\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.742495 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742218 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dlst\" (UniqueName: \"kubernetes.io/projected/4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb-kube-api-access-9dlst\") pod \"iptables-alerter-sg2qr\" (UID: \"4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb\") " pod="openshift-network-operator/iptables-alerter-sg2qr"
Apr 16 19:53:42.742495 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742242 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-kubelet\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.742495 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742263 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5037aa30-5243-46c1-9238-71a0ee0cc436-ovnkube-config\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.742495 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742284 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-multus-cni-dir\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.742495 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742307 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-host-run-netns\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.742495 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742329 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-run-ovn-kubernetes\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.742495 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742352 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5037aa30-5243-46c1-9238-71a0ee0cc436-ovnkube-script-lib\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.742495 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742377 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-host-var-lib-kubelet\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.742495 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742413 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-modprobe-d\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.742495 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742439 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5488e199-2008-42c4-ab06-666d5ec0e2bf-konnectivity-ca\") pod \"konnectivity-agent-pxn88\" (UID: \"5488e199-2008-42c4-ab06-666d5ec0e2bf\") " pod="kube-system/konnectivity-agent-pxn88"
Apr 16 19:53:42.742495 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742487 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/940f6882-9538-4742-9cdc-585d4ceabae6-socket-dir\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2"
Apr 16 19:53:42.743239 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742539 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/940f6882-9538-4742-9cdc-585d4ceabae6-device-dir\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2"
Apr 16 19:53:42.743239 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742560 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-var-lib-kubelet\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.743239 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742587 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xjrh\" (UniqueName: \"kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh\") pod \"network-check-target-chnql\" (UID: \"c7e55932-e28c-4952-86fc-0a2e235083be\") " pod="openshift-network-diagnostics/network-check-target-chnql"
Apr 16 19:53:42.743239 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742640 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb-host-slash\") pod \"iptables-alerter-sg2qr\" (UID: \"4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb\") " pod="openshift-network-operator/iptables-alerter-sg2qr"
Apr 16 19:53:42.743239 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742663 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-cni-binary-copy\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q"
Apr 16 19:53:42.743239 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742690 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/940f6882-9538-4742-9cdc-585d4ceabae6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2"
Apr 16 19:53:42.743239 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742713 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-systemd-units\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.743239 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742724 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-modprobe-d\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.743239 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742740 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx4dm\" (UniqueName: \"kubernetes.io/projected/b0c44e61-db3b-4f44-bfc4-d928140603e4-kube-api-access-cx4dm\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.743239 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742780 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-run\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.743239 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742791 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-var-lib-kubelet\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.743239 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742805 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-sys\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.743239 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742832 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q"
Apr 16 19:53:42.743239 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742858 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q"
Apr 16 19:53:42.743239 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742895 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/940f6882-9538-4742-9cdc-585d4ceabae6-socket-dir\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2"
Apr 16 19:53:42.743239 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742904 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-hostroot\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.743239 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742933 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-host-run-multus-certs\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.743884 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742938 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/940f6882-9538-4742-9cdc-585d4ceabae6-device-dir\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2"
Apr 16 19:53:42.743884 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742979 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-sys\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.743884 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742961 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-sysctl-d\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.743884 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.742977 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5488e199-2008-42c4-ab06-666d5ec0e2bf-konnectivity-ca\") pod \"konnectivity-agent-pxn88\" (UID: \"5488e199-2008-42c4-ab06-666d5ec0e2bf\") " pod="kube-system/konnectivity-agent-pxn88"
Apr 16 19:53:42.743884 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743045 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/940f6882-9538-4742-9cdc-585d4ceabae6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2"
Apr 16 19:53:42.743884 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743051 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-run-openvswitch\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.743884 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743060 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-sysctl-d\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.743884 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743089 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-run\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.743884 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743148 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-system-cni-dir\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.743884 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743179 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b0c44e61-db3b-4f44-bfc4-d928140603e4-multus-daemon-config\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.743884 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743211 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-sysconfig\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.743884 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743279 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-sysconfig\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.743884 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743315 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-run-systemd\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.743884 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743378 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5037aa30-5243-46c1-9238-71a0ee0cc436-env-overrides\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.743884 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743424 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-cnibin\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q"
Apr 16 19:53:42.743884 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743574 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6eefa0ff-7de4-4c45-af84-a83e70151ad6-serviceca\") pod \"node-ca-nm7vd\" (UID: \"6eefa0ff-7de4-4c45-af84-a83e70151ad6\") " pod="openshift-image-registry/node-ca-nm7vd"
Apr 16 19:53:42.743884 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743631 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-var-lib-openvswitch\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.744449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743680 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-node-log\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.744449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743715 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-os-release\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q"
Apr 16 19:53:42.744449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743738 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6eefa0ff-7de4-4c45-af84-a83e70151ad6-host\") pod \"node-ca-nm7vd\" (UID: \"6eefa0ff-7de4-4c45-af84-a83e70151ad6\") " pod="openshift-image-registry/node-ca-nm7vd"
Apr 16 19:53:42.744449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743761 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p67xj\" (UniqueName: \"kubernetes.io/projected/81d750f7-8363-48b6-afd3-9847607883b7-kube-api-access-p67xj\") pod \"network-metrics-daemon-p54df\" (UID: \"81d750f7-8363-48b6-afd3-9847607883b7\") " pod="openshift-multus/network-metrics-daemon-p54df"
Apr 16 19:53:42.744449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743798 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-cnibin\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.744449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743841 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-etc-kubernetes\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.744449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743880 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-sysctl-conf\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.744449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743906 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-lib-modules\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.744449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743928 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-host\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.744449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743953 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-run-ovn\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.744449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.743975 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-log-socket\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.744449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.744014 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-cni-bin\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.744449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.744041 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-sysctl-conf\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.744449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.744097 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-host\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.744449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.744045 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-cni-netd\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.744449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.744130 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-lib-modules\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.744449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.744141 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-system-cni-dir\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q"
Apr 16 19:53:42.745158 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.744168 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjqbz\" (UniqueName: \"kubernetes.io/projected/940f6882-9538-4742-9cdc-585d4ceabae6-kube-api-access-bjqbz\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2"
Apr 16 19:53:42.745158 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.744190 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/940f6882-9538-4742-9cdc-585d4ceabae6-sys-fs\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2"
Apr 16 19:53:42.745158 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.744218 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q"
Apr 16 19:53:42.745158 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.744242 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-os-release\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.745158 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.744264 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-host-var-lib-cni-bin\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.745158 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.744287 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kd52\" (UniqueName: \"kubernetes.io/projected/6eefa0ff-7de4-4c45-af84-a83e70151ad6-kube-api-access-2kd52\") pod \"node-ca-nm7vd\" (UID: \"6eefa0ff-7de4-4c45-af84-a83e70151ad6\") " pod="openshift-image-registry/node-ca-nm7vd"
Apr 16 19:53:42.745158 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.744311 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb-iptables-alerter-script\") pod \"iptables-alerter-sg2qr\" (UID: \"4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb\") " pod="openshift-network-operator/iptables-alerter-sg2qr"
Apr 16 19:53:42.745158 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.744332 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-host-run-k8s-cni-cncf-io\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.745158 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.744487 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/940f6882-9538-4742-9cdc-585d4ceabae6-sys-fs\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2"
Apr 16 19:53:42.745158 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.744714 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-tmp\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.745158 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.744794 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-etc-tuned\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.745158 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.745101 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5488e199-2008-42c4-ab06-666d5ec0e2bf-agent-certs\") pod \"konnectivity-agent-pxn88\" (UID: \"5488e199-2008-42c4-ab06-666d5ec0e2bf\") " pod="kube-system/konnectivity-agent-pxn88"
Apr 16 19:53:42.749851 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.749835 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crwws\" (UniqueName: \"kubernetes.io/projected/41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4-kube-api-access-crwws\") pod \"tuned-tbhz7\" (UID: \"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4\") " pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:42.751830 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.751813 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjqbz\" (UniqueName: \"kubernetes.io/projected/940f6882-9538-4742-9cdc-585d4ceabae6-kube-api-access-bjqbz\") pod \"aws-ebs-csi-driver-node-64nb2\" (UID: \"940f6882-9538-4742-9cdc-585d4ceabae6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2"
Apr 16 19:53:42.796890 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.796849 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal" event={"ID":"00ed7f10126a5d8aca706b75de740849","Type":"ContainerStarted","Data":"f4833b7d6fa1e8f9af4eea7da7415d890304a1521d3733d7e662a751aa1ff81e"}
Apr 16 19:53:42.798545 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.798520 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-244.ec2.internal" event={"ID":"c2fe7a6e5acf4d9e84afac6e0df862e1","Type":"ContainerStarted","Data":"f7a6249afadedb6194e133db41733c00985492860b4d12a561a66588489d63ce"}
Apr 16 19:53:42.844753 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.844732 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6eefa0ff-7de4-4c45-af84-a83e70151ad6-host\") pod \"node-ca-nm7vd\" (UID: \"6eefa0ff-7de4-4c45-af84-a83e70151ad6\") " pod="openshift-image-registry/node-ca-nm7vd"
Apr 16 19:53:42.844837 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.844759 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p67xj\" (UniqueName: \"kubernetes.io/projected/81d750f7-8363-48b6-afd3-9847607883b7-kube-api-access-p67xj\") pod \"network-metrics-daemon-p54df\" (UID: \"81d750f7-8363-48b6-afd3-9847607883b7\") " pod="openshift-multus/network-metrics-daemon-p54df"
Apr 16 19:53:42.844837 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.844775 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-cnibin\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.844837 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.844798 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-etc-kubernetes\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.844837 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.844830 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-etc-kubernetes\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.845036 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.844837 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6eefa0ff-7de4-4c45-af84-a83e70151ad6-host\") pod \"node-ca-nm7vd\" (UID: \"6eefa0ff-7de4-4c45-af84-a83e70151ad6\") " pod="openshift-image-registry/node-ca-nm7vd"
Apr 16 19:53:42.845036 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.844860 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-cnibin\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.845036 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.844931 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-run-ovn\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.845036 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.844973 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-log-socket\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.845036 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.844989 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-run-ovn\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.845036 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.844998 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-cni-bin\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.845036 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845019 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName:
\"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-log-socket\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.845036 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845026 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-cni-netd\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.845465 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845051 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-system-cni-dir\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q" Apr 16 19:53:42.845465 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845069 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-cni-bin\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.845465 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845078 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q" Apr 16 19:53:42.845465 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845089 2563 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-cni-netd\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.845465 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845105 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-os-release\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.845465 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845132 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-host-var-lib-cni-bin\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.845465 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845150 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-os-release\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.845465 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845155 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kd52\" (UniqueName: \"kubernetes.io/projected/6eefa0ff-7de4-4c45-af84-a83e70151ad6-kube-api-access-2kd52\") pod \"node-ca-nm7vd\" (UID: \"6eefa0ff-7de4-4c45-af84-a83e70151ad6\") " pod="openshift-image-registry/node-ca-nm7vd" Apr 16 19:53:42.845465 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845178 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-host-var-lib-cni-bin\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.845465 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845180 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb-iptables-alerter-script\") pod \"iptables-alerter-sg2qr\" (UID: \"4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb\") " pod="openshift-network-operator/iptables-alerter-sg2qr" Apr 16 19:53:42.845465 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845207 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-host-run-k8s-cni-cncf-io\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.845465 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845108 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-system-cni-dir\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q" Apr 16 19:53:42.845465 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845235 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.845465 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845282 2563 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-host-run-k8s-cni-cncf-io\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.845465 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845281 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.845465 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845289 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs\") pod \"network-metrics-daemon-p54df\" (UID: \"81d750f7-8363-48b6-afd3-9847607883b7\") " pod="openshift-multus/network-metrics-daemon-p54df" Apr 16 19:53:42.845465 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845321 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-slash\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.846294 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845346 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-run-netns\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.846294 
ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845371 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-etc-openvswitch\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.846294 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845386 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-slash\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.846294 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845394 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5037aa30-5243-46c1-9238-71a0ee0cc436-ovn-node-metrics-cert\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.846294 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:42.845405 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:42.846294 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845391 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-run-netns\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.846294 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845421 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-etc-openvswitch\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.846294 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845445 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-multus-socket-dir-parent\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.846294 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:42.845470 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs podName:81d750f7-8363-48b6-afd3-9847607883b7 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:43.345442617 +0000 UTC m=+2.122019601 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs") pod "network-metrics-daemon-p54df" (UID: "81d750f7-8363-48b6-afd3-9847607883b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:42.846294 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845509 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-multus-socket-dir-parent\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.846294 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845554 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttpgz\" (UniqueName: \"kubernetes.io/projected/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-kube-api-access-ttpgz\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q" Apr 16 19:53:42.846294 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845571 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q" Apr 16 19:53:42.846294 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845594 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b0c44e61-db3b-4f44-bfc4-d928140603e4-cni-binary-copy\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.846294 ip-10-0-135-244 
kubenswrapper[2563]: I0416 19:53:42.845636 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-host-var-lib-cni-multus\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.846294 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845669 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-host-var-lib-cni-multus\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.846294 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845724 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5j6bm\" (UniqueName: \"kubernetes.io/projected/5037aa30-5243-46c1-9238-71a0ee0cc436-kube-api-access-5j6bm\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.846294 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845741 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-multus-conf-dir\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.846868 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845752 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb-iptables-alerter-script\") pod \"iptables-alerter-sg2qr\" (UID: \"4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb\") " pod="openshift-network-operator/iptables-alerter-sg2qr" Apr 16 
19:53:42.846868 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845768 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dlst\" (UniqueName: \"kubernetes.io/projected/4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb-kube-api-access-9dlst\") pod \"iptables-alerter-sg2qr\" (UID: \"4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb\") " pod="openshift-network-operator/iptables-alerter-sg2qr" Apr 16 19:53:42.846868 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845794 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-kubelet\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.846868 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845802 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-multus-conf-dir\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.846868 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845817 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5037aa30-5243-46c1-9238-71a0ee0cc436-ovnkube-config\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.846868 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845842 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-multus-cni-dir\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 
19:53:42.846868 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845864 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-host-run-netns\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.846868 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845889 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-run-ovn-kubernetes\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.846868 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845913 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5037aa30-5243-46c1-9238-71a0ee0cc436-ovnkube-script-lib\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.846868 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845864 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-kubelet\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.846868 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845924 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-multus-cni-dir\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.846868 
ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845936 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-host-var-lib-kubelet\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.846868 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845963 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-host-run-ovn-kubernetes\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.846868 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845975 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-host-var-lib-kubelet\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.846868 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.845983 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xjrh\" (UniqueName: \"kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh\") pod \"network-check-target-chnql\" (UID: \"c7e55932-e28c-4952-86fc-0a2e235083be\") " pod="openshift-network-diagnostics/network-check-target-chnql" Apr 16 19:53:42.846868 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846016 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb-host-slash\") pod \"iptables-alerter-sg2qr\" (UID: \"4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb\") " 
pod="openshift-network-operator/iptables-alerter-sg2qr" Apr 16 19:53:42.846868 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846040 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-cni-binary-copy\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q" Apr 16 19:53:42.847449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846058 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-host-run-netns\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.847449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846065 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-systemd-units\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" Apr 16 19:53:42.847449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846089 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cx4dm\" (UniqueName: \"kubernetes.io/projected/b0c44e61-db3b-4f44-bfc4-d928140603e4-kube-api-access-cx4dm\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.847449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846109 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb-host-slash\") pod \"iptables-alerter-sg2qr\" (UID: \"4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb\") 
" pod="openshift-network-operator/iptables-alerter-sg2qr" Apr 16 19:53:42.847449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846114 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q" Apr 16 19:53:42.847449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846146 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b0c44e61-db3b-4f44-bfc4-d928140603e4-cni-binary-copy\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.847449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846147 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q" Apr 16 19:53:42.847449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846202 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-hostroot\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9" Apr 16 19:53:42.847449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846230 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-host-run-multus-certs\") pod 
\"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.847449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846241 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q"
Apr 16 19:53:42.847449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846259 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-run-openvswitch\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.847449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846288 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-system-cni-dir\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.847449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846312 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b0c44e61-db3b-4f44-bfc4-d928140603e4-multus-daemon-config\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.847449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846340 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-run-systemd\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.847449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846366 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5037aa30-5243-46c1-9238-71a0ee0cc436-env-overrides\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.847449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846392 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-cnibin\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q"
Apr 16 19:53:42.847449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846421 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6eefa0ff-7de4-4c45-af84-a83e70151ad6-serviceca\") pod \"node-ca-nm7vd\" (UID: \"6eefa0ff-7de4-4c45-af84-a83e70151ad6\") " pod="openshift-image-registry/node-ca-nm7vd"
Apr 16 19:53:42.847914 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846445 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-var-lib-openvswitch\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.847914 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846461 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-system-cni-dir\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.847914 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846470 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-node-log\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.847914 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846492 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5037aa30-5243-46c1-9238-71a0ee0cc436-ovnkube-config\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.847914 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846499 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-os-release\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q"
Apr 16 19:53:42.847914 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846507 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-hostroot\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.847914 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846562 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q"
Apr 16 19:53:42.847914 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846574 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-node-log\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.847914 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846631 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-var-lib-openvswitch\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.847914 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846652 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-cnibin\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q"
Apr 16 19:53:42.847914 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846676 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-run-openvswitch\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.847914 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846704 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-systemd-units\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.847914 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846719 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b0c44e61-db3b-4f44-bfc4-d928140603e4-host-run-multus-certs\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.847914 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846773 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5037aa30-5243-46c1-9238-71a0ee0cc436-run-systemd\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.847914 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846774 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-os-release\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q"
Apr 16 19:53:42.847914 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846885 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5037aa30-5243-46c1-9238-71a0ee0cc436-env-overrides\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.847914 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.846902 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6eefa0ff-7de4-4c45-af84-a83e70151ad6-serviceca\") pod \"node-ca-nm7vd\" (UID: \"6eefa0ff-7de4-4c45-af84-a83e70151ad6\") " pod="openshift-image-registry/node-ca-nm7vd"
Apr 16 19:53:42.847914 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.847041 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b0c44e61-db3b-4f44-bfc4-d928140603e4-multus-daemon-config\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.848386 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.847174 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5037aa30-5243-46c1-9238-71a0ee0cc436-ovnkube-script-lib\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.848386 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.847235 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-cni-binary-copy\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q"
Apr 16 19:53:42.848386 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.847644 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5037aa30-5243-46c1-9238-71a0ee0cc436-ovn-node-metrics-cert\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.852419 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:42.852401 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:53:42.852479 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:42.852426 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:53:42.852479 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:42.852441 2563 projected.go:194] Error preparing data for projected volume kube-api-access-7xjrh for pod openshift-network-diagnostics/network-check-target-chnql: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:53:42.852592 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:42.852501 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh podName:c7e55932-e28c-4952-86fc-0a2e235083be nodeName:}" failed. No retries permitted until 2026-04-16 19:53:43.352484111 +0000 UTC m=+2.129061106 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7xjrh" (UniqueName: "kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh") pod "network-check-target-chnql" (UID: "c7e55932-e28c-4952-86fc-0a2e235083be") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:53:42.852710 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.852696 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p67xj\" (UniqueName: \"kubernetes.io/projected/81d750f7-8363-48b6-afd3-9847607883b7-kube-api-access-p67xj\") pod \"network-metrics-daemon-p54df\" (UID: \"81d750f7-8363-48b6-afd3-9847607883b7\") " pod="openshift-multus/network-metrics-daemon-p54df"
Apr 16 19:53:42.857308 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.857283 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx4dm\" (UniqueName: \"kubernetes.io/projected/b0c44e61-db3b-4f44-bfc4-d928140603e4-kube-api-access-cx4dm\") pod \"multus-rjds9\" (UID: \"b0c44e61-db3b-4f44-bfc4-d928140603e4\") " pod="openshift-multus/multus-rjds9"
Apr 16 19:53:42.857308 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.857300 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dlst\" (UniqueName: \"kubernetes.io/projected/4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb-kube-api-access-9dlst\") pod \"iptables-alerter-sg2qr\" (UID: \"4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb\") " pod="openshift-network-operator/iptables-alerter-sg2qr"
Apr 16 19:53:42.857687 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.857670 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kd52\" (UniqueName: \"kubernetes.io/projected/6eefa0ff-7de4-4c45-af84-a83e70151ad6-kube-api-access-2kd52\") pod \"node-ca-nm7vd\" (UID: \"6eefa0ff-7de4-4c45-af84-a83e70151ad6\") " pod="openshift-image-registry/node-ca-nm7vd"
Apr 16 19:53:42.857804 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.857789 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttpgz\" (UniqueName: \"kubernetes.io/projected/fedbf08e-3ecd-47fe-bbea-4ca1def89a98-kube-api-access-ttpgz\") pod \"multus-additional-cni-plugins-vqw9q\" (UID: \"fedbf08e-3ecd-47fe-bbea-4ca1def89a98\") " pod="openshift-multus/multus-additional-cni-plugins-vqw9q"
Apr 16 19:53:42.857968 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.857954 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j6bm\" (UniqueName: \"kubernetes.io/projected/5037aa30-5243-46c1-9238-71a0ee0cc436-kube-api-access-5j6bm\") pod \"ovnkube-node-8s7w4\" (UID: \"5037aa30-5243-46c1-9238-71a0ee0cc436\") " pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:42.969238 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.969028 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pxn88"
Apr 16 19:53:42.975587 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:42.975564 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5488e199_2008_42c4_ab06_666d5ec0e2bf.slice/crio-11dcd4ffc74632453daf56eca8eb07903abac7a248f458ae0979a26d983fabfe WatchSource:0}: Error finding container 11dcd4ffc74632453daf56eca8eb07903abac7a248f458ae0979a26d983fabfe: Status 404 returned error can't find the container with id 11dcd4ffc74632453daf56eca8eb07903abac7a248f458ae0979a26d983fabfe
Apr 16 19:53:42.979448 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.979427 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2"
Apr 16 19:53:42.984919 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:42.984892 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod940f6882_9538_4742_9cdc_585d4ceabae6.slice/crio-178ba944ca3c036eae2aa36026faa69718263071e230a691fd8b5ee217f52855 WatchSource:0}: Error finding container 178ba944ca3c036eae2aa36026faa69718263071e230a691fd8b5ee217f52855: Status 404 returned error can't find the container with id 178ba944ca3c036eae2aa36026faa69718263071e230a691fd8b5ee217f52855
Apr 16 19:53:42.998708 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:42.998693 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tbhz7"
Apr 16 19:53:43.004443 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:43.004413 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41cdc007_f7dd_4cb0_9634_b6c0f69c8ff4.slice/crio-d6bd3415693055becee0ef3834429ada65240c4c16949b6d8a4dde6c0b8691c8 WatchSource:0}: Error finding container d6bd3415693055becee0ef3834429ada65240c4c16949b6d8a4dde6c0b8691c8: Status 404 returned error can't find the container with id d6bd3415693055becee0ef3834429ada65240c4c16949b6d8a4dde6c0b8691c8
Apr 16 19:53:43.023012 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:43.022993 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nm7vd"
Apr 16 19:53:43.028242 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:43.028222 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eefa0ff_7de4_4c45_af84_a83e70151ad6.slice/crio-a91db738d00d0f93942fca51cf0bda0694ebf31f5ceace8d35d8995fa129e146 WatchSource:0}: Error finding container a91db738d00d0f93942fca51cf0bda0694ebf31f5ceace8d35d8995fa129e146: Status 404 returned error can't find the container with id a91db738d00d0f93942fca51cf0bda0694ebf31f5ceace8d35d8995fa129e146
Apr 16 19:53:43.028242 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:43.028235 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vqw9q"
Apr 16 19:53:43.034391 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:43.034371 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rjds9"
Apr 16 19:53:43.035065 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:43.035033 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfedbf08e_3ecd_47fe_bbea_4ca1def89a98.slice/crio-0339955edf85161670529c69f91c153ab8677c5aab5c93ac6ed677f64a1b4b85 WatchSource:0}: Error finding container 0339955edf85161670529c69f91c153ab8677c5aab5c93ac6ed677f64a1b4b85: Status 404 returned error can't find the container with id 0339955edf85161670529c69f91c153ab8677c5aab5c93ac6ed677f64a1b4b85
Apr 16 19:53:43.039965 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:43.039948 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0c44e61_db3b_4f44_bfc4_d928140603e4.slice/crio-95782ec759015f6e454f6f922f5758d8e61b7665802a0110723bfac5446f5c8a WatchSource:0}: Error finding container 95782ec759015f6e454f6f922f5758d8e61b7665802a0110723bfac5446f5c8a: Status 404 returned error can't find the container with id 95782ec759015f6e454f6f922f5758d8e61b7665802a0110723bfac5446f5c8a
Apr 16 19:53:43.040866 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:43.040847 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-sg2qr"
Apr 16 19:53:43.046268 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:43.046248 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a94789b_3b6e_4baf_b5d0_edbc3d2d18cb.slice/crio-646da545ab4d571c4077da91dfd0c52cd8c3e46762a539b666d2f3e96fafe36e WatchSource:0}: Error finding container 646da545ab4d571c4077da91dfd0c52cd8c3e46762a539b666d2f3e96fafe36e: Status 404 returned error can't find the container with id 646da545ab4d571c4077da91dfd0c52cd8c3e46762a539b666d2f3e96fafe36e
Apr 16 19:53:43.046980 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:43.046964 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:53:43.052557 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:53:43.052538 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5037aa30_5243_46c1_9238_71a0ee0cc436.slice/crio-86dfbfa365f7ac2f15791562f58f72a258c67c9f8d34766c0be7adc4e20ff246 WatchSource:0}: Error finding container 86dfbfa365f7ac2f15791562f58f72a258c67c9f8d34766c0be7adc4e20ff246: Status 404 returned error can't find the container with id 86dfbfa365f7ac2f15791562f58f72a258c67c9f8d34766c0be7adc4e20ff246
Apr 16 19:53:43.349526 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:43.349372 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs\") pod \"network-metrics-daemon-p54df\" (UID: \"81d750f7-8363-48b6-afd3-9847607883b7\") " pod="openshift-multus/network-metrics-daemon-p54df"
Apr 16 19:53:43.349526 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:43.349516 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:43.349756 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:43.349584 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs podName:81d750f7-8363-48b6-afd3-9847607883b7 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:44.34956511 +0000 UTC m=+3.126142100 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs") pod "network-metrics-daemon-p54df" (UID: "81d750f7-8363-48b6-afd3-9847607883b7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:43.450738 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:43.450183 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xjrh\" (UniqueName: \"kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh\") pod \"network-check-target-chnql\" (UID: \"c7e55932-e28c-4952-86fc-0a2e235083be\") " pod="openshift-network-diagnostics/network-check-target-chnql"
Apr 16 19:53:43.450738 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:43.450336 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:53:43.450738 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:43.450356 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:53:43.450738 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:43.450368 2563 projected.go:194] Error preparing data for projected volume kube-api-access-7xjrh for pod openshift-network-diagnostics/network-check-target-chnql: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:53:43.450738 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:43.450422 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh podName:c7e55932-e28c-4952-86fc-0a2e235083be nodeName:}" failed. No retries permitted until 2026-04-16 19:53:44.450404275 +0000 UTC m=+3.226981256 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-7xjrh" (UniqueName: "kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh") pod "network-check-target-chnql" (UID: "c7e55932-e28c-4952-86fc-0a2e235083be") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:53:43.650964 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:43.650681 2563 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:43.680391 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:43.680305 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:48:42 +0000 UTC" deadline="2027-12-15 13:47:16.549869751 +0000 UTC"
Apr 16 19:53:43.680391 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:43.680340 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14585h53m32.869533675s"
Apr 16 19:53:43.796352 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:43.795834 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p54df"
Apr 16 19:53:43.796352 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:43.795965 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p54df" podUID="81d750f7-8363-48b6-afd3-9847607883b7"
Apr 16 19:53:43.814277 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:43.814244 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" event={"ID":"5037aa30-5243-46c1-9238-71a0ee0cc436","Type":"ContainerStarted","Data":"86dfbfa365f7ac2f15791562f58f72a258c67c9f8d34766c0be7adc4e20ff246"}
Apr 16 19:53:43.815852 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:43.815806 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-sg2qr" event={"ID":"4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb","Type":"ContainerStarted","Data":"646da545ab4d571c4077da91dfd0c52cd8c3e46762a539b666d2f3e96fafe36e"}
Apr 16 19:53:43.825789 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:43.825763 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rjds9" event={"ID":"b0c44e61-db3b-4f44-bfc4-d928140603e4","Type":"ContainerStarted","Data":"95782ec759015f6e454f6f922f5758d8e61b7665802a0110723bfac5446f5c8a"}
Apr 16 19:53:43.835366 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:43.835342 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vqw9q" event={"ID":"fedbf08e-3ecd-47fe-bbea-4ca1def89a98","Type":"ContainerStarted","Data":"0339955edf85161670529c69f91c153ab8677c5aab5c93ac6ed677f64a1b4b85"}
Apr 16 19:53:43.856171 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:43.856137 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" event={"ID":"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4","Type":"ContainerStarted","Data":"d6bd3415693055becee0ef3834429ada65240c4c16949b6d8a4dde6c0b8691c8"}
Apr 16 19:53:43.869813 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:43.869633 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pxn88" event={"ID":"5488e199-2008-42c4-ab06-666d5ec0e2bf","Type":"ContainerStarted","Data":"11dcd4ffc74632453daf56eca8eb07903abac7a248f458ae0979a26d983fabfe"}
Apr 16 19:53:43.886921 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:43.886894 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nm7vd" event={"ID":"6eefa0ff-7de4-4c45-af84-a83e70151ad6","Type":"ContainerStarted","Data":"a91db738d00d0f93942fca51cf0bda0694ebf31f5ceace8d35d8995fa129e146"}
Apr 16 19:53:43.903021 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:43.902959 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2" event={"ID":"940f6882-9538-4742-9cdc-585d4ceabae6","Type":"ContainerStarted","Data":"178ba944ca3c036eae2aa36026faa69718263071e230a691fd8b5ee217f52855"}
Apr 16 19:53:44.358667 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:44.358018 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs\") pod \"network-metrics-daemon-p54df\" (UID: \"81d750f7-8363-48b6-afd3-9847607883b7\") " pod="openshift-multus/network-metrics-daemon-p54df"
Apr 16 19:53:44.358667 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:44.358194 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:44.358667 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:44.358253 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs podName:81d750f7-8363-48b6-afd3-9847607883b7 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:46.358235272 +0000 UTC m=+5.134812251 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs") pod "network-metrics-daemon-p54df" (UID: "81d750f7-8363-48b6-afd3-9847607883b7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:44.458985 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:44.458856 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xjrh\" (UniqueName: \"kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh\") pod \"network-check-target-chnql\" (UID: \"c7e55932-e28c-4952-86fc-0a2e235083be\") " pod="openshift-network-diagnostics/network-check-target-chnql"
Apr 16 19:53:44.459160 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:44.459057 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:53:44.459160 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:44.459080 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:53:44.459160 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:44.459093 2563 projected.go:194] Error preparing data for projected volume kube-api-access-7xjrh for pod openshift-network-diagnostics/network-check-target-chnql: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:53:44.459378 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:44.459168 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh podName:c7e55932-e28c-4952-86fc-0a2e235083be nodeName:}" failed. No retries permitted until 2026-04-16 19:53:46.459137769 +0000 UTC m=+5.235714752 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7xjrh" (UniqueName: "kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh") pod "network-check-target-chnql" (UID: "c7e55932-e28c-4952-86fc-0a2e235083be") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:53:44.642632 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:44.642537 2563 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:44.680667 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:44.680613 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:48:42 +0000 UTC" deadline="2028-01-17 13:45:21.954524898 +0000 UTC"
Apr 16 19:53:44.680667 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:44.680649 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15377h51m37.273879385s"
Apr 16 19:53:44.794218 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:44.794184 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chnql"
Apr 16 19:53:44.794368 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:44.794316 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chnql" podUID="c7e55932-e28c-4952-86fc-0a2e235083be"
Apr 16 19:53:45.735848 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:45.735789 2563 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:45.794766 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:45.794738 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p54df"
Apr 16 19:53:45.794931 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:45.794883 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p54df" podUID="81d750f7-8363-48b6-afd3-9847607883b7"
Apr 16 19:53:46.375803 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:46.375767 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs\") pod \"network-metrics-daemon-p54df\" (UID: \"81d750f7-8363-48b6-afd3-9847607883b7\") " pod="openshift-multus/network-metrics-daemon-p54df"
Apr 16 19:53:46.375993 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:46.375932 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:46.376062 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:46.376002 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs podName:81d750f7-8363-48b6-afd3-9847607883b7 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:50.375983355 +0000 UTC m=+9.152560346 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs") pod "network-metrics-daemon-p54df" (UID: "81d750f7-8363-48b6-afd3-9847607883b7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:46.476660 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:46.476618 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xjrh\" (UniqueName: \"kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh\") pod \"network-check-target-chnql\" (UID: \"c7e55932-e28c-4952-86fc-0a2e235083be\") " pod="openshift-network-diagnostics/network-check-target-chnql"
Apr 16 19:53:46.476837 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:46.476780 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:53:46.476837 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:46.476800 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:53:46.476837 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:46.476813 2563 projected.go:194] Error preparing data for projected volume kube-api-access-7xjrh for pod openshift-network-diagnostics/network-check-target-chnql: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:53:46.476994 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:46.476877 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh podName:c7e55932-e28c-4952-86fc-0a2e235083be nodeName:}" failed. No retries permitted until 2026-04-16 19:53:50.476857414 +0000 UTC m=+9.253434394 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7xjrh" (UniqueName: "kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh") pod "network-check-target-chnql" (UID: "c7e55932-e28c-4952-86fc-0a2e235083be") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:53:46.794550 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:46.794476 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chnql"
Apr 16 19:53:46.794994 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:46.794591 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chnql" podUID="c7e55932-e28c-4952-86fc-0a2e235083be"
Apr 16 19:53:47.795411 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:47.794872 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p54df"
Apr 16 19:53:47.795411 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:47.795047 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p54df" podUID="81d750f7-8363-48b6-afd3-9847607883b7" Apr 16 19:53:48.794434 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:48.793989 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chnql" Apr 16 19:53:48.794434 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:48.794130 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chnql" podUID="c7e55932-e28c-4952-86fc-0a2e235083be" Apr 16 19:53:49.794759 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:49.794716 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p54df" Apr 16 19:53:49.795208 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:49.794839 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p54df" podUID="81d750f7-8363-48b6-afd3-9847607883b7" Apr 16 19:53:50.410319 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:50.410272 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs\") pod \"network-metrics-daemon-p54df\" (UID: \"81d750f7-8363-48b6-afd3-9847607883b7\") " pod="openshift-multus/network-metrics-daemon-p54df" Apr 16 19:53:50.410513 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:50.410432 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:50.410513 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:50.410505 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs podName:81d750f7-8363-48b6-afd3-9847607883b7 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:58.410486795 +0000 UTC m=+17.187063775 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs") pod "network-metrics-daemon-p54df" (UID: "81d750f7-8363-48b6-afd3-9847607883b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:50.511300 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:50.511258 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xjrh\" (UniqueName: \"kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh\") pod \"network-check-target-chnql\" (UID: \"c7e55932-e28c-4952-86fc-0a2e235083be\") " pod="openshift-network-diagnostics/network-check-target-chnql" Apr 16 19:53:50.511485 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:50.511430 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:50.511485 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:50.511455 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:50.511485 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:50.511467 2563 projected.go:194] Error preparing data for projected volume kube-api-access-7xjrh for pod openshift-network-diagnostics/network-check-target-chnql: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:50.511669 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:50.511524 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh podName:c7e55932-e28c-4952-86fc-0a2e235083be nodeName:}" failed. 
No retries permitted until 2026-04-16 19:53:58.51151068 +0000 UTC m=+17.288087665 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7xjrh" (UniqueName: "kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh") pod "network-check-target-chnql" (UID: "c7e55932-e28c-4952-86fc-0a2e235083be") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:50.794299 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:50.794167 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chnql" Apr 16 19:53:50.794445 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:50.794299 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chnql" podUID="c7e55932-e28c-4952-86fc-0a2e235083be" Apr 16 19:53:51.795635 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:51.795563 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p54df" Apr 16 19:53:51.796110 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:51.795704 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p54df" podUID="81d750f7-8363-48b6-afd3-9847607883b7" Apr 16 19:53:52.794324 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:52.794249 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chnql" Apr 16 19:53:52.794470 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:52.794364 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chnql" podUID="c7e55932-e28c-4952-86fc-0a2e235083be" Apr 16 19:53:53.794862 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:53.794825 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p54df" Apr 16 19:53:53.795312 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:53.794972 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p54df" podUID="81d750f7-8363-48b6-afd3-9847607883b7" Apr 16 19:53:54.794790 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:54.794754 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chnql" Apr 16 19:53:54.794968 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:54.794887 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chnql" podUID="c7e55932-e28c-4952-86fc-0a2e235083be" Apr 16 19:53:55.794969 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:55.794930 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p54df" Apr 16 19:53:55.795323 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:55.795031 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p54df" podUID="81d750f7-8363-48b6-afd3-9847607883b7" Apr 16 19:53:56.794346 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:56.794310 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chnql" Apr 16 19:53:56.794527 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:56.794469 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-chnql" podUID="c7e55932-e28c-4952-86fc-0a2e235083be" Apr 16 19:53:57.314994 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:57.314967 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-vpg7x"] Apr 16 19:53:57.353784 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:57.353761 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vpg7x" Apr 16 19:53:57.353943 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:57.353832 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vpg7x" podUID="e922f21a-2e9d-4d74-9bbf-9f154ed71518" Apr 16 19:53:57.462014 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:57.461977 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret\") pod \"global-pull-secret-syncer-vpg7x\" (UID: \"e922f21a-2e9d-4d74-9bbf-9f154ed71518\") " pod="kube-system/global-pull-secret-syncer-vpg7x" Apr 16 19:53:57.462186 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:57.462057 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e922f21a-2e9d-4d74-9bbf-9f154ed71518-kubelet-config\") pod \"global-pull-secret-syncer-vpg7x\" (UID: \"e922f21a-2e9d-4d74-9bbf-9f154ed71518\") " pod="kube-system/global-pull-secret-syncer-vpg7x" Apr 16 19:53:57.462186 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:57.462076 2563 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e922f21a-2e9d-4d74-9bbf-9f154ed71518-dbus\") pod \"global-pull-secret-syncer-vpg7x\" (UID: \"e922f21a-2e9d-4d74-9bbf-9f154ed71518\") " pod="kube-system/global-pull-secret-syncer-vpg7x" Apr 16 19:53:57.563280 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:57.563249 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e922f21a-2e9d-4d74-9bbf-9f154ed71518-kubelet-config\") pod \"global-pull-secret-syncer-vpg7x\" (UID: \"e922f21a-2e9d-4d74-9bbf-9f154ed71518\") " pod="kube-system/global-pull-secret-syncer-vpg7x" Apr 16 19:53:57.563455 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:57.563287 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e922f21a-2e9d-4d74-9bbf-9f154ed71518-dbus\") pod \"global-pull-secret-syncer-vpg7x\" (UID: \"e922f21a-2e9d-4d74-9bbf-9f154ed71518\") " pod="kube-system/global-pull-secret-syncer-vpg7x" Apr 16 19:53:57.563455 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:57.563333 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e922f21a-2e9d-4d74-9bbf-9f154ed71518-kubelet-config\") pod \"global-pull-secret-syncer-vpg7x\" (UID: \"e922f21a-2e9d-4d74-9bbf-9f154ed71518\") " pod="kube-system/global-pull-secret-syncer-vpg7x" Apr 16 19:53:57.563455 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:57.563353 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret\") pod \"global-pull-secret-syncer-vpg7x\" (UID: \"e922f21a-2e9d-4d74-9bbf-9f154ed71518\") " pod="kube-system/global-pull-secret-syncer-vpg7x" Apr 16 19:53:57.563618 ip-10-0-135-244 
kubenswrapper[2563]: E0416 19:53:57.563474 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:57.563618 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:57.563479 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e922f21a-2e9d-4d74-9bbf-9f154ed71518-dbus\") pod \"global-pull-secret-syncer-vpg7x\" (UID: \"e922f21a-2e9d-4d74-9bbf-9f154ed71518\") " pod="kube-system/global-pull-secret-syncer-vpg7x" Apr 16 19:53:57.563618 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:57.563539 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret podName:e922f21a-2e9d-4d74-9bbf-9f154ed71518 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:58.063520468 +0000 UTC m=+16.840097465 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret") pod "global-pull-secret-syncer-vpg7x" (UID: "e922f21a-2e9d-4d74-9bbf-9f154ed71518") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:57.794156 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:57.794085 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p54df" Apr 16 19:53:57.794299 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:57.794255 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p54df" podUID="81d750f7-8363-48b6-afd3-9847607883b7" Apr 16 19:53:58.068217 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:58.068139 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret\") pod \"global-pull-secret-syncer-vpg7x\" (UID: \"e922f21a-2e9d-4d74-9bbf-9f154ed71518\") " pod="kube-system/global-pull-secret-syncer-vpg7x" Apr 16 19:53:58.068351 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:58.068248 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:58.068351 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:58.068301 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret podName:e922f21a-2e9d-4d74-9bbf-9f154ed71518 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:59.068286418 +0000 UTC m=+17.844863415 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret") pod "global-pull-secret-syncer-vpg7x" (UID: "e922f21a-2e9d-4d74-9bbf-9f154ed71518") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:58.470909 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:58.470828 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs\") pod \"network-metrics-daemon-p54df\" (UID: \"81d750f7-8363-48b6-afd3-9847607883b7\") " pod="openshift-multus/network-metrics-daemon-p54df" Apr 16 19:53:58.471286 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:58.470949 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:58.471286 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:58.471008 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs podName:81d750f7-8363-48b6-afd3-9847607883b7 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:14.470993608 +0000 UTC m=+33.247570585 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs") pod "network-metrics-daemon-p54df" (UID: "81d750f7-8363-48b6-afd3-9847607883b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:58.571581 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:58.571552 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xjrh\" (UniqueName: \"kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh\") pod \"network-check-target-chnql\" (UID: \"c7e55932-e28c-4952-86fc-0a2e235083be\") " pod="openshift-network-diagnostics/network-check-target-chnql" Apr 16 19:53:58.571773 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:58.571727 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:58.571773 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:58.571752 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:58.571773 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:58.571767 2563 projected.go:194] Error preparing data for projected volume kube-api-access-7xjrh for pod openshift-network-diagnostics/network-check-target-chnql: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:58.571927 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:58.571826 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh podName:c7e55932-e28c-4952-86fc-0a2e235083be nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:14.571805919 +0000 UTC m=+33.348382920 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7xjrh" (UniqueName: "kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh") pod "network-check-target-chnql" (UID: "c7e55932-e28c-4952-86fc-0a2e235083be") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:58.794580 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:58.794514 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vpg7x" Apr 16 19:53:58.794730 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:58.794514 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chnql" Apr 16 19:53:58.794730 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:58.794620 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vpg7x" podUID="e922f21a-2e9d-4d74-9bbf-9f154ed71518" Apr 16 19:53:58.794730 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:58.794690 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-chnql" podUID="c7e55932-e28c-4952-86fc-0a2e235083be" Apr 16 19:53:59.075121 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:59.075048 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret\") pod \"global-pull-secret-syncer-vpg7x\" (UID: \"e922f21a-2e9d-4d74-9bbf-9f154ed71518\") " pod="kube-system/global-pull-secret-syncer-vpg7x" Apr 16 19:53:59.075267 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:59.075198 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:59.075267 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:59.075263 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret podName:e922f21a-2e9d-4d74-9bbf-9f154ed71518 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:01.075248139 +0000 UTC m=+19.851825115 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret") pod "global-pull-secret-syncer-vpg7x" (UID: "e922f21a-2e9d-4d74-9bbf-9f154ed71518") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:59.794826 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:53:59.794797 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p54df" Apr 16 19:53:59.795265 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:53:59.794924 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p54df" podUID="81d750f7-8363-48b6-afd3-9847607883b7" Apr 16 19:54:00.794379 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:00.794358 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chnql" Apr 16 19:54:00.794464 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:00.794443 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chnql" podUID="c7e55932-e28c-4952-86fc-0a2e235083be" Apr 16 19:54:00.794464 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:00.794358 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vpg7x" Apr 16 19:54:00.794568 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:00.794545 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-vpg7x" podUID="e922f21a-2e9d-4d74-9bbf-9f154ed71518"
Apr 16 19:54:00.938138 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:00.938110 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-244.ec2.internal" event={"ID":"c2fe7a6e5acf4d9e84afac6e0df862e1","Type":"ContainerStarted","Data":"00c2ee540becc0642d6203b97c74eda103a4c28f43eb447d891f286c939bcbf7"}
Apr 16 19:54:00.938713 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:00.938321 2563 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-244.ec2.internal"
Apr 16 19:54:00.942670 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:00.942522 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log"
Apr 16 19:54:00.943776 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:00.943746 2563 generic.go:358] "Generic (PLEG): container finished" podID="5037aa30-5243-46c1-9238-71a0ee0cc436" containerID="a2aa9c1cc33296a858c094caa4b343011909ab499b1cc9cc93fb08165cbbc97a" exitCode=1
Apr 16 19:54:00.943872 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:00.943819 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" event={"ID":"5037aa30-5243-46c1-9238-71a0ee0cc436","Type":"ContainerStarted","Data":"da8788d3d2f25a203eec0e01d390925b667da26c87e893aabb9ba490771b3fa5"}
Apr 16 19:54:00.943872 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:00.943842 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" event={"ID":"5037aa30-5243-46c1-9238-71a0ee0cc436","Type":"ContainerStarted","Data":"a080a61a64877de97d2c8c023a56e2e7e5581418461bf09d742535d549c970de"}
Apr 16 19:54:00.943872 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:00.943858 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" event={"ID":"5037aa30-5243-46c1-9238-71a0ee0cc436","Type":"ContainerDied","Data":"a2aa9c1cc33296a858c094caa4b343011909ab499b1cc9cc93fb08165cbbc97a"}
Apr 16 19:54:00.944007 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:00.943873 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" event={"ID":"5037aa30-5243-46c1-9238-71a0ee0cc436","Type":"ContainerStarted","Data":"cc75793edb3c4c372a9be840726d733e4d2e11aae01151808efb89c965158228"}
Apr 16 19:54:00.949842 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:00.949824 2563 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 19:54:00.950192 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:00.950171 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-135-244.ec2.internal"]
Apr 16 19:54:00.952641 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:00.952592 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rjds9" event={"ID":"b0c44e61-db3b-4f44-bfc4-d928140603e4","Type":"ContainerStarted","Data":"97f2f0971d3d11c9e6a00b018e6e67a60165f8b023f3c04f842d9e5c79f3e56c"}
Apr 16 19:54:00.958155 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:00.958130 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" event={"ID":"41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4","Type":"ContainerStarted","Data":"bdb98ac530ebb9a7fc453bc8cfb6d2e4346545118ced57a9874fe3a1655e010c"}
Apr 16 19:54:00.977573 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:00.977502 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rjds9" podStartSLOduration=2.360800153 podStartE2EDuration="19.977486431s" podCreationTimestamp="2026-04-16 19:53:41 +0000 UTC" firstStartedPulling="2026-04-16 19:53:43.041373771 +0000 UTC m=+1.817950748" lastFinishedPulling="2026-04-16 19:54:00.658060038 +0000 UTC m=+19.434637026" observedRunningTime="2026-04-16 19:54:00.976824158 +0000 UTC m=+19.753401157" watchObservedRunningTime="2026-04-16 19:54:00.977486431 +0000 UTC m=+19.754063430"
Apr 16 19:54:00.997878 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:00.997733 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-244.ec2.internal" podStartSLOduration=0.997716771 podStartE2EDuration="997.716771ms" podCreationTimestamp="2026-04-16 19:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:00.997433726 +0000 UTC m=+19.774010727" watchObservedRunningTime="2026-04-16 19:54:00.997716771 +0000 UTC m=+19.774293794"
Apr 16 19:54:01.027066 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:01.026971 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-tbhz7" podStartSLOduration=2.522380495 podStartE2EDuration="20.026954217s" podCreationTimestamp="2026-04-16 19:53:41 +0000 UTC" firstStartedPulling="2026-04-16 19:53:43.005876673 +0000 UTC m=+1.782453651" lastFinishedPulling="2026-04-16 19:54:00.510450382 +0000 UTC m=+19.287027373" observedRunningTime="2026-04-16 19:54:01.026091836 +0000 UTC m=+19.802668836" watchObservedRunningTime="2026-04-16 19:54:01.026954217 +0000 UTC m=+19.803531218"
Apr 16 19:54:01.092747 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:01.092709 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret\") pod \"global-pull-secret-syncer-vpg7x\" (UID: \"e922f21a-2e9d-4d74-9bbf-9f154ed71518\") " pod="kube-system/global-pull-secret-syncer-vpg7x"
Apr 16 19:54:01.092922 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:01.092850 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:01.092922 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:01.092914 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret podName:e922f21a-2e9d-4d74-9bbf-9f154ed71518 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:05.092896701 +0000 UTC m=+23.869473681 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret") pod "global-pull-secret-syncer-vpg7x" (UID: "e922f21a-2e9d-4d74-9bbf-9f154ed71518") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:01.795424 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:01.795353 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p54df"
Apr 16 19:54:01.795558 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:01.795463 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p54df" podUID="81d750f7-8363-48b6-afd3-9847607883b7"
Apr 16 19:54:01.961945 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:01.961907 2563 generic.go:358] "Generic (PLEG): container finished" podID="00ed7f10126a5d8aca706b75de740849" containerID="b9bdf35202548d69f80fc5213c96485ee1358506dfbcb4e6f7c38610235a4a0d" exitCode=0
Apr 16 19:54:01.962362 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:01.961950 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal" event={"ID":"00ed7f10126a5d8aca706b75de740849","Type":"ContainerDied","Data":"b9bdf35202548d69f80fc5213c96485ee1358506dfbcb4e6f7c38610235a4a0d"}
Apr 16 19:54:01.963670 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:01.963634 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nm7vd" event={"ID":"6eefa0ff-7de4-4c45-af84-a83e70151ad6","Type":"ContainerStarted","Data":"ae3e48b8a0d7bded2337d7c4cdb3f314bfd22b4b3643cfe6f02daf969b79d261"}
Apr 16 19:54:01.965332 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:01.965313 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2" event={"ID":"940f6882-9538-4742-9cdc-585d4ceabae6","Type":"ContainerStarted","Data":"da7e1131e137a8def74ae69668fa9bdc435e76f9abe828509b25f4ffb1d8dbc2"}
Apr 16 19:54:01.968196 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:01.968176 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log"
Apr 16 19:54:01.968587 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:01.968541 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" event={"ID":"5037aa30-5243-46c1-9238-71a0ee0cc436","Type":"ContainerStarted","Data":"26466daac9619a201cd495d805ca7bbb515fdaca145e9a4e91b84375c67a7a17"}
Apr 16 19:54:01.968587 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:01.968566 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" event={"ID":"5037aa30-5243-46c1-9238-71a0ee0cc436","Type":"ContainerStarted","Data":"af9d87e160df47860238377f0dfd8503de5306c3ea30c1e0871068fb11875d85"}
Apr 16 19:54:01.970048 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:01.970016 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-sg2qr" event={"ID":"4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb","Type":"ContainerStarted","Data":"60a928b0846b7d666584cd33ab5701ac89a7a64293beedd75c484a0e8a9edfa6"}
Apr 16 19:54:01.971673 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:01.971578 2563 generic.go:358] "Generic (PLEG): container finished" podID="fedbf08e-3ecd-47fe-bbea-4ca1def89a98" containerID="eadd21916bcfc95764df05a197d0855a54782eb1e96401a816aa597032f78374" exitCode=0
Apr 16 19:54:01.971673 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:01.971644 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vqw9q" event={"ID":"fedbf08e-3ecd-47fe-bbea-4ca1def89a98","Type":"ContainerDied","Data":"eadd21916bcfc95764df05a197d0855a54782eb1e96401a816aa597032f78374"}
Apr 16 19:54:01.973573 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:01.973545 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pxn88" event={"ID":"5488e199-2008-42c4-ab06-666d5ec0e2bf","Type":"ContainerStarted","Data":"c6ef2a4ef2d41a954b5139a0b701316b00977726ecb5fecaa6158a8f0e4b7685"}
Apr 16 19:54:01.973981 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:01.973967 2563 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-244.ec2.internal"
Apr 16 19:54:01.985110 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:01.985089 2563 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 19:54:01.985190 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:01.985140 2563 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-proxy-ip-10-0-135-244.ec2.internal\" already exists" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-244.ec2.internal"
Apr 16 19:54:02.016893 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:02.016842 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-nm7vd" podStartSLOduration=3.533870645 podStartE2EDuration="21.016826913s" podCreationTimestamp="2026-04-16 19:53:41 +0000 UTC" firstStartedPulling="2026-04-16 19:53:43.030466597 +0000 UTC m=+1.807043576" lastFinishedPulling="2026-04-16 19:54:00.51342286 +0000 UTC m=+19.289999844" observedRunningTime="2026-04-16 19:54:01.991871681 +0000 UTC m=+20.768448682" watchObservedRunningTime="2026-04-16 19:54:02.016826913 +0000 UTC m=+20.793403915"
Apr 16 19:54:02.032082 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:02.032042 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-sg2qr" podStartSLOduration=3.563581211 podStartE2EDuration="21.032029962s" podCreationTimestamp="2026-04-16 19:53:41 +0000 UTC" firstStartedPulling="2026-04-16 19:53:43.047984843 +0000 UTC m=+1.824561819" lastFinishedPulling="2026-04-16 19:54:00.516433585 +0000 UTC m=+19.293010570" observedRunningTime="2026-04-16 19:54:02.031753274 +0000 UTC m=+20.808330274" watchObservedRunningTime="2026-04-16 19:54:02.032029962 +0000 UTC m=+20.808606961"
Apr 16 19:54:02.047477 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:02.047380 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-pxn88" podStartSLOduration=3.510999359 podStartE2EDuration="21.047366761s" podCreationTimestamp="2026-04-16 19:53:41 +0000 UTC" firstStartedPulling="2026-04-16 19:53:42.977072263 +0000 UTC m=+1.753649239" lastFinishedPulling="2026-04-16 19:54:00.513439647 +0000 UTC m=+19.290016641" observedRunningTime="2026-04-16 19:54:02.04650597 +0000 UTC m=+20.823082969" watchObservedRunningTime="2026-04-16 19:54:02.047366761 +0000 UTC m=+20.823943761"
Apr 16 19:54:02.351400 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:02.351379 2563 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 19:54:02.426051 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:02.426021 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-pxn88"
Apr 16 19:54:02.426813 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:02.426795 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-pxn88"
Apr 16 19:54:02.711858 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:02.711714 2563 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T19:54:02.351396381Z","UUID":"9c432cdc-15f9-4893-8930-7b8c15e93b19","Handler":null,"Name":"","Endpoint":""}
Apr 16 19:54:02.713225 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:02.713198 2563 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 19:54:02.713225 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:02.713225 2563 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 19:54:02.794550 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:02.794518 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vpg7x"
Apr 16 19:54:02.794719 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:02.794527 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chnql"
Apr 16 19:54:02.794719 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:02.794670 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vpg7x" podUID="e922f21a-2e9d-4d74-9bbf-9f154ed71518"
Apr 16 19:54:02.794823 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:02.794750 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chnql" podUID="c7e55932-e28c-4952-86fc-0a2e235083be"
Apr 16 19:54:02.978668 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:02.978121 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal" event={"ID":"00ed7f10126a5d8aca706b75de740849","Type":"ContainerStarted","Data":"f49accb973c0a6cc6ada2e1c2ec92411b8f1358d397aa2c761ffab5cc631137b"}
Apr 16 19:54:02.981043 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:02.981017 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2" event={"ID":"940f6882-9538-4742-9cdc-585d4ceabae6","Type":"ContainerStarted","Data":"ea17356359a1ae64ee0ad80ecb6495d37d54404801323b5d96c54252b5636f22"}
Apr 16 19:54:02.981541 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:02.981512 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-pxn88"
Apr 16 19:54:02.982026 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:02.982008 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-pxn88"
Apr 16 19:54:03.001964 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:03.001892 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-244.ec2.internal" podStartSLOduration=21.001880867 podStartE2EDuration="21.001880867s" podCreationTimestamp="2026-04-16 19:53:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:03.001359191 +0000 UTC m=+21.777936193" watchObservedRunningTime="2026-04-16 19:54:03.001880867 +0000 UTC m=+21.778457866"
Apr 16 19:54:03.794292 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:03.794104 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p54df"
Apr 16 19:54:03.794514 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:03.794376 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p54df" podUID="81d750f7-8363-48b6-afd3-9847607883b7"
Apr 16 19:54:03.984924 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:03.984885 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2" event={"ID":"940f6882-9538-4742-9cdc-585d4ceabae6","Type":"ContainerStarted","Data":"5d832e568f738d648e5a044b111a91a26a7d5b227702f47c0145790560b40fdd"}
Apr 16 19:54:03.988490 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:03.988467 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log"
Apr 16 19:54:03.988842 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:03.988816 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" event={"ID":"5037aa30-5243-46c1-9238-71a0ee0cc436","Type":"ContainerStarted","Data":"980753fe3ced74b6361813fc4735f7801339567d4cf9c54daf207ae61919ff18"}
Apr 16 19:54:04.477529 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:04.477471 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-64nb2" podStartSLOduration=3.2857548850000002 podStartE2EDuration="23.477445516s" podCreationTimestamp="2026-04-16 19:53:41 +0000 UTC" firstStartedPulling="2026-04-16 19:53:42.986361311 +0000 UTC m=+1.762938288" lastFinishedPulling="2026-04-16 19:54:03.178051928 +0000 UTC m=+21.954628919" observedRunningTime="2026-04-16 19:54:04.007665063 +0000 UTC m=+22.784242062" watchObservedRunningTime="2026-04-16 19:54:04.477445516 +0000 UTC m=+23.254022519"
Apr 16 19:54:04.478029 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:04.478012 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-6hmqh"]
Apr 16 19:54:04.482523 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:04.482497 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6hmqh"
Apr 16 19:54:04.485623 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:04.485576 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 19:54:04.485623 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:04.485617 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 19:54:04.485786 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:04.485629 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-r5qgr\""
Apr 16 19:54:04.619306 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:04.619268 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fdcd8eab-4705-4045-bbc6-5974072ac6dd-hosts-file\") pod \"node-resolver-6hmqh\" (UID: \"fdcd8eab-4705-4045-bbc6-5974072ac6dd\") " pod="openshift-dns/node-resolver-6hmqh"
Apr 16 19:54:04.619485 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:04.619315 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fdcd8eab-4705-4045-bbc6-5974072ac6dd-tmp-dir\") pod \"node-resolver-6hmqh\" (UID: \"fdcd8eab-4705-4045-bbc6-5974072ac6dd\") " pod="openshift-dns/node-resolver-6hmqh"
Apr 16 19:54:04.619485 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:04.619398 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxtvb\" (UniqueName: \"kubernetes.io/projected/fdcd8eab-4705-4045-bbc6-5974072ac6dd-kube-api-access-pxtvb\") pod \"node-resolver-6hmqh\" (UID: \"fdcd8eab-4705-4045-bbc6-5974072ac6dd\") " pod="openshift-dns/node-resolver-6hmqh"
Apr 16 19:54:04.719822 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:04.719796 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fdcd8eab-4705-4045-bbc6-5974072ac6dd-hosts-file\") pod \"node-resolver-6hmqh\" (UID: \"fdcd8eab-4705-4045-bbc6-5974072ac6dd\") " pod="openshift-dns/node-resolver-6hmqh"
Apr 16 19:54:04.720002 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:04.719830 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fdcd8eab-4705-4045-bbc6-5974072ac6dd-tmp-dir\") pod \"node-resolver-6hmqh\" (UID: \"fdcd8eab-4705-4045-bbc6-5974072ac6dd\") " pod="openshift-dns/node-resolver-6hmqh"
Apr 16 19:54:04.720002 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:04.719928 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fdcd8eab-4705-4045-bbc6-5974072ac6dd-hosts-file\") pod \"node-resolver-6hmqh\" (UID: \"fdcd8eab-4705-4045-bbc6-5974072ac6dd\") " pod="openshift-dns/node-resolver-6hmqh"
Apr 16 19:54:04.720108 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:04.719985 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxtvb\" (UniqueName: \"kubernetes.io/projected/fdcd8eab-4705-4045-bbc6-5974072ac6dd-kube-api-access-pxtvb\") pod \"node-resolver-6hmqh\" (UID: \"fdcd8eab-4705-4045-bbc6-5974072ac6dd\") " pod="openshift-dns/node-resolver-6hmqh"
Apr 16 19:54:04.720258 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:04.720240 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fdcd8eab-4705-4045-bbc6-5974072ac6dd-tmp-dir\") pod \"node-resolver-6hmqh\" (UID: \"fdcd8eab-4705-4045-bbc6-5974072ac6dd\") " pod="openshift-dns/node-resolver-6hmqh"
Apr 16 19:54:04.736456 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:04.736398 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxtvb\" (UniqueName: \"kubernetes.io/projected/fdcd8eab-4705-4045-bbc6-5974072ac6dd-kube-api-access-pxtvb\") pod \"node-resolver-6hmqh\" (UID: \"fdcd8eab-4705-4045-bbc6-5974072ac6dd\") " pod="openshift-dns/node-resolver-6hmqh"
Apr 16 19:54:04.792211 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:04.792183 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6hmqh"
Apr 16 19:54:04.793935 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:04.793829 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vpg7x"
Apr 16 19:54:04.794053 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:04.793942 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vpg7x" podUID="e922f21a-2e9d-4d74-9bbf-9f154ed71518"
Apr 16 19:54:04.794109 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:04.794035 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chnql"
Apr 16 19:54:04.794201 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:04.794178 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chnql" podUID="c7e55932-e28c-4952-86fc-0a2e235083be"
Apr 16 19:54:05.123539 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:05.123460 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret\") pod \"global-pull-secret-syncer-vpg7x\" (UID: \"e922f21a-2e9d-4d74-9bbf-9f154ed71518\") " pod="kube-system/global-pull-secret-syncer-vpg7x"
Apr 16 19:54:05.124248 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:05.123717 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:05.124248 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:05.123793 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret podName:e922f21a-2e9d-4d74-9bbf-9f154ed71518 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:13.123774235 +0000 UTC m=+31.900351219 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret") pod "global-pull-secret-syncer-vpg7x" (UID: "e922f21a-2e9d-4d74-9bbf-9f154ed71518") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:05.794358 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:05.794322 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p54df"
Apr 16 19:54:05.794536 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:05.794464 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p54df" podUID="81d750f7-8363-48b6-afd3-9847607883b7"
Apr 16 19:54:06.318271 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:54:06.318232 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdcd8eab_4705_4045_bbc6_5974072ac6dd.slice/crio-fc89d9506232ea311c365761351f274a454fc58f72c84780eb7bca948fb7c352 WatchSource:0}: Error finding container fc89d9506232ea311c365761351f274a454fc58f72c84780eb7bca948fb7c352: Status 404 returned error can't find the container with id fc89d9506232ea311c365761351f274a454fc58f72c84780eb7bca948fb7c352
Apr 16 19:54:06.795028 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:06.794881 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chnql"
Apr 16 19:54:06.795126 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:06.794914 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vpg7x"
Apr 16 19:54:06.795126 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:06.795095 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chnql" podUID="c7e55932-e28c-4952-86fc-0a2e235083be"
Apr 16 19:54:06.795206 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:06.795166 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vpg7x" podUID="e922f21a-2e9d-4d74-9bbf-9f154ed71518"
Apr 16 19:54:06.994823 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:06.994751 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6hmqh" event={"ID":"fdcd8eab-4705-4045-bbc6-5974072ac6dd","Type":"ContainerStarted","Data":"5425b207fd6809e0934731398286969347745ca6b76affd6937c53062e2f6b40"}
Apr 16 19:54:06.994823 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:06.994796 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6hmqh" event={"ID":"fdcd8eab-4705-4045-bbc6-5974072ac6dd","Type":"ContainerStarted","Data":"fc89d9506232ea311c365761351f274a454fc58f72c84780eb7bca948fb7c352"}
Apr 16 19:54:06.997489 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:06.997470 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log"
Apr 16 19:54:06.997814 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:06.997796 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" event={"ID":"5037aa30-5243-46c1-9238-71a0ee0cc436","Type":"ContainerStarted","Data":"8c26d99f207efaf0903a1b7904162052fe26b413719835fef83eb4d7daf63112"}
Apr 16 19:54:06.998118 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:06.998098 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:54:06.998220 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:06.998124 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:54:06.998220 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:06.998137 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:54:06.998335 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:06.998221 2563 scope.go:117] "RemoveContainer" containerID="a2aa9c1cc33296a858c094caa4b343011909ab499b1cc9cc93fb08165cbbc97a"
Apr 16 19:54:06.999815 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:06.999788 2563 generic.go:358] "Generic (PLEG): container finished" podID="fedbf08e-3ecd-47fe-bbea-4ca1def89a98" containerID="51e726eef5cd85d800ce834d28ade4f21f0e79fe15b385db3b4a48ee658f33db" exitCode=0
Apr 16 19:54:06.999924 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:06.999832 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vqw9q" event={"ID":"fedbf08e-3ecd-47fe-bbea-4ca1def89a98","Type":"ContainerDied","Data":"51e726eef5cd85d800ce834d28ade4f21f0e79fe15b385db3b4a48ee658f33db"}
Apr 16 19:54:07.009378 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:07.009330 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6hmqh" podStartSLOduration=3.009314857 podStartE2EDuration="3.009314857s" podCreationTimestamp="2026-04-16 19:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:07.008757726 +0000 UTC m=+25.785334948" watchObservedRunningTime="2026-04-16 19:54:07.009314857 +0000 UTC m=+25.785891874"
Apr 16 19:54:07.013125 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:07.013089 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:54:07.013384 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:07.013369 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:54:07.794352 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:07.794317 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p54df"
Apr 16 19:54:07.794827 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:07.794429 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p54df" podUID="81d750f7-8363-48b6-afd3-9847607883b7"
Apr 16 19:54:08.005239 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:08.005075 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log"
Apr 16 19:54:08.005576 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:08.005548 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" event={"ID":"5037aa30-5243-46c1-9238-71a0ee0cc436","Type":"ContainerStarted","Data":"d9536bca589e67cb04a51156346be6c2c729c2da2fe1cfaf9e0238b3d78c06e7"}
Apr 16 19:54:08.007413 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:08.007391 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vqw9q" event={"ID":"fedbf08e-3ecd-47fe-bbea-4ca1def89a98","Type":"ContainerStarted","Data":"1673d49da0f2c5694709ea1b9bd883fae7813f63b179efd60f20578aa0e06022"}
Apr 16 19:54:08.061930 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:08.061850 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4" podStartSLOduration=9.464618851 podStartE2EDuration="27.061837026s" podCreationTimestamp="2026-04-16 19:53:41 +0000 UTC" firstStartedPulling="2026-04-16 19:53:43.054201389 +0000 UTC m=+1.830778366" lastFinishedPulling="2026-04-16 19:54:00.651419562 +0000 UTC m=+19.427996541" observedRunningTime="2026-04-16 19:54:08.061398068 +0000 UTC m=+26.837975078" watchObservedRunningTime="2026-04-16 19:54:08.061837026 +0000 UTC m=+26.838414025"
Apr 16 19:54:08.341373 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:08.341300 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vpg7x"]
Apr 16 19:54:08.341576 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:08.341441 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vpg7x"
Apr 16 19:54:08.341576 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:08.341564 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vpg7x" podUID="e922f21a-2e9d-4d74-9bbf-9f154ed71518"
Apr 16 19:54:08.341981 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:08.341960 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-p54df"]
Apr 16 19:54:08.342064 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:08.342048 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p54df"
Apr 16 19:54:08.342165 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:08.342148 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p54df" podUID="81d750f7-8363-48b6-afd3-9847607883b7"
Apr 16 19:54:08.345165 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:08.345146 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-chnql"]
Apr 16 19:54:08.345253 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:08.345241 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chnql"
Apr 16 19:54:08.345347 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:08.345328 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chnql" podUID="c7e55932-e28c-4952-86fc-0a2e235083be"
Apr 16 19:54:09.011092 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:09.011058 2563 generic.go:358] "Generic (PLEG): container finished" podID="fedbf08e-3ecd-47fe-bbea-4ca1def89a98" containerID="1673d49da0f2c5694709ea1b9bd883fae7813f63b179efd60f20578aa0e06022" exitCode=0
Apr 16 19:54:09.011580 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:09.011143 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vqw9q" event={"ID":"fedbf08e-3ecd-47fe-bbea-4ca1def89a98","Type":"ContainerDied","Data":"1673d49da0f2c5694709ea1b9bd883fae7813f63b179efd60f20578aa0e06022"}
Apr 16 19:54:09.794279 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:09.794244 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vpg7x"
Apr 16 19:54:09.794398 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:09.794381 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-vpg7x" podUID="e922f21a-2e9d-4d74-9bbf-9f154ed71518" Apr 16 19:54:09.794450 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:09.794428 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chnql" Apr 16 19:54:09.794525 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:09.794508 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chnql" podUID="c7e55932-e28c-4952-86fc-0a2e235083be" Apr 16 19:54:10.015511 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:10.015480 2563 generic.go:358] "Generic (PLEG): container finished" podID="fedbf08e-3ecd-47fe-bbea-4ca1def89a98" containerID="186e6464a890ca1434c4a28c7a197ccd227c5ea6fc35c536bb338d4ced326afc" exitCode=0 Apr 16 19:54:10.016106 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:10.015548 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vqw9q" event={"ID":"fedbf08e-3ecd-47fe-bbea-4ca1def89a98","Type":"ContainerDied","Data":"186e6464a890ca1434c4a28c7a197ccd227c5ea6fc35c536bb338d4ced326afc"} Apr 16 19:54:10.794805 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:10.794774 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p54df" Apr 16 19:54:10.794977 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:10.794919 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p54df" podUID="81d750f7-8363-48b6-afd3-9847607883b7" Apr 16 19:54:11.795803 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:11.795772 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chnql" Apr 16 19:54:11.796236 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:11.795874 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chnql" podUID="c7e55932-e28c-4952-86fc-0a2e235083be" Apr 16 19:54:11.796236 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:11.795933 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vpg7x" Apr 16 19:54:11.796236 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:11.796018 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vpg7x" podUID="e922f21a-2e9d-4d74-9bbf-9f154ed71518" Apr 16 19:54:12.794047 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:12.794018 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p54df"
Apr 16 19:54:12.794199 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:12.794124 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p54df" podUID="81d750f7-8363-48b6-afd3-9847607883b7"
Apr 16 19:54:13.190548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.190429 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret\") pod \"global-pull-secret-syncer-vpg7x\" (UID: \"e922f21a-2e9d-4d74-9bbf-9f154ed71518\") " pod="kube-system/global-pull-secret-syncer-vpg7x"
Apr 16 19:54:13.190992 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:13.190562 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:13.190992 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:13.190657 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret podName:e922f21a-2e9d-4d74-9bbf-9f154ed71518 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.190638247 +0000 UTC m=+47.967215227 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret") pod "global-pull-secret-syncer-vpg7x" (UID: "e922f21a-2e9d-4d74-9bbf-9f154ed71518") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:13.586638 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.586557 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-244.ec2.internal" event="NodeReady"
Apr 16 19:54:13.586788 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.586722 2563 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 19:54:13.626443 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.626411 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7cc4c694b8-2m66w"]
Apr 16 19:54:13.652025 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.652001 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.663618 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.663451 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7cc4c694b8-2m66w"]
Apr 16 19:54:13.681968 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.681946 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 19:54:13.681968 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.681964 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 19:54:13.682280 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.682260 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-q97t2\""
Apr 16 19:54:13.683462 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.683443 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 19:54:13.689741 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.689715 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dtqvs"]
Apr 16 19:54:13.706343 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.706321 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 19:54:13.706894 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.706871 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8q4g7"]
Apr 16 19:54:13.706998 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.706983 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dtqvs"
Apr 16 19:54:13.710294 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.710260 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 19:54:13.710432 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.710410 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8flcv\""
Apr 16 19:54:13.710539 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.710490 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 19:54:13.726727 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.726707 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dtqvs"]
Apr 16 19:54:13.726827 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.726813 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8q4g7"
Apr 16 19:54:13.729985 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.729966 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 19:54:13.729985 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.729982 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 19:54:13.730260 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.730242 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 19:54:13.739277 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.739261 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7n2jf\""
Apr 16 19:54:13.740440 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.740422 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8q4g7"]
Apr 16 19:54:13.793987 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.793963 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vpg7x"
Apr 16 19:54:13.794090 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.794030 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/133663ab-a7a5-4f8a-8659-5dcb18604eed-metrics-tls\") pod \"dns-default-dtqvs\" (UID: \"133663ab-a7a5-4f8a-8659-5dcb18604eed\") " pod="openshift-dns/dns-default-dtqvs"
Apr 16 19:54:13.794090 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.793964 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chnql"
Apr 16 19:54:13.794090 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.794067 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7b325581-2334-4c35-ade9-6b22c9297769-image-registry-private-configuration\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.794244 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.794106 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b325581-2334-4c35-ade9-6b22c9297769-trusted-ca\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.794244 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.794161 2563 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkdts\" (UniqueName: \"kubernetes.io/projected/133663ab-a7a5-4f8a-8659-5dcb18604eed-kube-api-access-hkdts\") pod \"dns-default-dtqvs\" (UID: \"133663ab-a7a5-4f8a-8659-5dcb18604eed\") " pod="openshift-dns/dns-default-dtqvs"
Apr 16 19:54:13.794244 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.794200 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b325581-2334-4c35-ade9-6b22c9297769-registry-certificates\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.794244 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.794226 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b325581-2334-4c35-ade9-6b22c9297769-installation-pull-secrets\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.794428 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.794251 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-bound-sa-token\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.794428 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.794332 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b325581-2334-4c35-ade9-6b22c9297769-ca-trust-extracted\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.794428 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.794358 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/133663ab-a7a5-4f8a-8659-5dcb18604eed-tmp-dir\") pod \"dns-default-dtqvs\" (UID: \"133663ab-a7a5-4f8a-8659-5dcb18604eed\") " pod="openshift-dns/dns-default-dtqvs"
Apr 16 19:54:13.794428 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.794387 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/133663ab-a7a5-4f8a-8659-5dcb18604eed-config-volume\") pod \"dns-default-dtqvs\" (UID: \"133663ab-a7a5-4f8a-8659-5dcb18604eed\") " pod="openshift-dns/dns-default-dtqvs"
Apr 16 19:54:13.794574 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.794492 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.794574 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.794531 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl5xl\" (UniqueName: \"kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-kube-api-access-gl5xl\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.797233 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.797214 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 19:54:13.797359 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.797276 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 19:54:13.797359 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.797280 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p7kgs\""
Apr 16 19:54:13.802771 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.802752 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 19:54:13.895656 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.895624 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b325581-2334-4c35-ade9-6b22c9297769-trusted-ca\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.895815 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.895669 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkdts\" (UniqueName: \"kubernetes.io/projected/133663ab-a7a5-4f8a-8659-5dcb18604eed-kube-api-access-hkdts\") pod \"dns-default-dtqvs\" (UID: \"133663ab-a7a5-4f8a-8659-5dcb18604eed\") " pod="openshift-dns/dns-default-dtqvs"
Apr 16 19:54:13.895815 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.895698 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b325581-2334-4c35-ade9-6b22c9297769-registry-certificates\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.895815 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.895717 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b325581-2334-4c35-ade9-6b22c9297769-installation-pull-secrets\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.895815 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.895734 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-bound-sa-token\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.895989 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.895877 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b325581-2334-4c35-ade9-6b22c9297769-ca-trust-extracted\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.895989 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.895907 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/133663ab-a7a5-4f8a-8659-5dcb18604eed-tmp-dir\") pod \"dns-default-dtqvs\" (UID: \"133663ab-a7a5-4f8a-8659-5dcb18604eed\") " pod="openshift-dns/dns-default-dtqvs"
Apr 16 19:54:13.895989 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.895937 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/133663ab-a7a5-4f8a-8659-5dcb18604eed-config-volume\") pod \"dns-default-dtqvs\" (UID: \"133663ab-a7a5-4f8a-8659-5dcb18604eed\") " pod="openshift-dns/dns-default-dtqvs"
Apr 16 19:54:13.896118 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.896001 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.896118 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.896025 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gl5xl\" (UniqueName: \"kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-kube-api-access-gl5xl\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.896118 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.896053 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pknt\" (UniqueName: \"kubernetes.io/projected/02f069bd-5606-4bac-9784-8646fdf8c979-kube-api-access-7pknt\") pod \"ingress-canary-8q4g7\" (UID: \"02f069bd-5606-4bac-9784-8646fdf8c979\") " pod="openshift-ingress-canary/ingress-canary-8q4g7"
Apr 16 19:54:13.896118 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.896097 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02f069bd-5606-4bac-9784-8646fdf8c979-cert\") pod \"ingress-canary-8q4g7\" (UID: \"02f069bd-5606-4bac-9784-8646fdf8c979\") " pod="openshift-ingress-canary/ingress-canary-8q4g7"
Apr 16 19:54:13.896306 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.896128 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/133663ab-a7a5-4f8a-8659-5dcb18604eed-metrics-tls\") pod \"dns-default-dtqvs\" (UID: \"133663ab-a7a5-4f8a-8659-5dcb18604eed\") " pod="openshift-dns/dns-default-dtqvs"
Apr 16 19:54:13.896306 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.896157 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7b325581-2334-4c35-ade9-6b22c9297769-image-registry-private-configuration\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.896306 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:13.896166 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:54:13.896306 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:13.896180 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7cc4c694b8-2m66w: secret "image-registry-tls" not found
Apr 16 19:54:13.896306 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:13.896241 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls podName:7b325581-2334-4c35-ade9-6b22c9297769 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:14.396222632 +0000 UTC m=+33.172799613 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls") pod "image-registry-7cc4c694b8-2m66w" (UID: "7b325581-2334-4c35-ade9-6b22c9297769") : secret "image-registry-tls" not found
Apr 16 19:54:13.896542 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.896406 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b325581-2334-4c35-ade9-6b22c9297769-registry-certificates\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.896580 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.896533 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/133663ab-a7a5-4f8a-8659-5dcb18604eed-tmp-dir\") pod \"dns-default-dtqvs\" (UID: \"133663ab-a7a5-4f8a-8659-5dcb18604eed\") " pod="openshift-dns/dns-default-dtqvs"
Apr 16 19:54:13.896658 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:13.896643 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:54:13.896710 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.896651 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b325581-2334-4c35-ade9-6b22c9297769-trusted-ca\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.896768 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:13.896718 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/133663ab-a7a5-4f8a-8659-5dcb18604eed-metrics-tls podName:133663ab-a7a5-4f8a-8659-5dcb18604eed nodeName:}" failed. No retries permitted until 2026-04-16 19:54:14.396701346 +0000 UTC m=+33.173278323 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/133663ab-a7a5-4f8a-8659-5dcb18604eed-metrics-tls") pod "dns-default-dtqvs" (UID: "133663ab-a7a5-4f8a-8659-5dcb18604eed") : secret "dns-default-metrics-tls" not found
Apr 16 19:54:13.896880 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.896860 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b325581-2334-4c35-ade9-6b22c9297769-ca-trust-extracted\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.896880 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.896871 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/133663ab-a7a5-4f8a-8659-5dcb18604eed-config-volume\") pod \"dns-default-dtqvs\" (UID: \"133663ab-a7a5-4f8a-8659-5dcb18604eed\") " pod="openshift-dns/dns-default-dtqvs"
Apr 16 19:54:13.900618 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.900585 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b325581-2334-4c35-ade9-6b22c9297769-installation-pull-secrets\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.900721 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.900593 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7b325581-2334-4c35-ade9-6b22c9297769-image-registry-private-configuration\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.906243 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.906220 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-bound-sa-token\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.906350 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.906273 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkdts\" (UniqueName: \"kubernetes.io/projected/133663ab-a7a5-4f8a-8659-5dcb18604eed-kube-api-access-hkdts\") pod \"dns-default-dtqvs\" (UID: \"133663ab-a7a5-4f8a-8659-5dcb18604eed\") " pod="openshift-dns/dns-default-dtqvs"
Apr 16 19:54:13.906514 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.906491 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl5xl\" (UniqueName: \"kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-kube-api-access-gl5xl\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:13.996731 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.996706 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pknt\" (UniqueName: \"kubernetes.io/projected/02f069bd-5606-4bac-9784-8646fdf8c979-kube-api-access-7pknt\") pod \"ingress-canary-8q4g7\" (UID: \"02f069bd-5606-4bac-9784-8646fdf8c979\") " pod="openshift-ingress-canary/ingress-canary-8q4g7"
Apr 16 19:54:13.996868 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:13.996743 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02f069bd-5606-4bac-9784-8646fdf8c979-cert\") pod \"ingress-canary-8q4g7\" (UID: \"02f069bd-5606-4bac-9784-8646fdf8c979\") " pod="openshift-ingress-canary/ingress-canary-8q4g7"
Apr 16 19:54:13.996868 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:13.996860 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:54:13.996953 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:13.996914 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02f069bd-5606-4bac-9784-8646fdf8c979-cert podName:02f069bd-5606-4bac-9784-8646fdf8c979 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:14.49690202 +0000 UTC m=+33.273479000 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/02f069bd-5606-4bac-9784-8646fdf8c979-cert") pod "ingress-canary-8q4g7" (UID: "02f069bd-5606-4bac-9784-8646fdf8c979") : secret "canary-serving-cert" not found
Apr 16 19:54:14.012790 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:14.012760 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pknt\" (UniqueName: \"kubernetes.io/projected/02f069bd-5606-4bac-9784-8646fdf8c979-kube-api-access-7pknt\") pod \"ingress-canary-8q4g7\" (UID: \"02f069bd-5606-4bac-9784-8646fdf8c979\") " pod="openshift-ingress-canary/ingress-canary-8q4g7"
Apr 16 19:54:14.398899 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:14.398849 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:14.399377 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:14.398924 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/133663ab-a7a5-4f8a-8659-5dcb18604eed-metrics-tls\") pod \"dns-default-dtqvs\" (UID: \"133663ab-a7a5-4f8a-8659-5dcb18604eed\") " pod="openshift-dns/dns-default-dtqvs"
Apr 16 19:54:14.399377 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:14.398957 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:54:14.399377 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:14.398983 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7cc4c694b8-2m66w: secret "image-registry-tls" not found
Apr 16 19:54:14.399377 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:14.399080 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls podName:7b325581-2334-4c35-ade9-6b22c9297769 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:15.399059303 +0000 UTC m=+34.175636303 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls") pod "image-registry-7cc4c694b8-2m66w" (UID: "7b325581-2334-4c35-ade9-6b22c9297769") : secret "image-registry-tls" not found
Apr 16 19:54:14.399377 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:14.399087 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:54:14.399377 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:14.399161 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/133663ab-a7a5-4f8a-8659-5dcb18604eed-metrics-tls podName:133663ab-a7a5-4f8a-8659-5dcb18604eed nodeName:}" failed. No retries permitted until 2026-04-16 19:54:15.399143059 +0000 UTC m=+34.175720040 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/133663ab-a7a5-4f8a-8659-5dcb18604eed-metrics-tls") pod "dns-default-dtqvs" (UID: "133663ab-a7a5-4f8a-8659-5dcb18604eed") : secret "dns-default-metrics-tls" not found
Apr 16 19:54:14.499902 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:14.499863 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02f069bd-5606-4bac-9784-8646fdf8c979-cert\") pod \"ingress-canary-8q4g7\" (UID: \"02f069bd-5606-4bac-9784-8646fdf8c979\") " pod="openshift-ingress-canary/ingress-canary-8q4g7"
Apr 16 19:54:14.500094 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:14.499940 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs\") pod \"network-metrics-daemon-p54df\" (UID: \"81d750f7-8363-48b6-afd3-9847607883b7\") " pod="openshift-multus/network-metrics-daemon-p54df"
Apr 16 19:54:14.500094 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:14.500018 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:54:14.500094 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:14.500044 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:14.500094 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:14.500083 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs podName:81d750f7-8363-48b6-afd3-9847607883b7 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:46.500067059 +0000 UTC m=+65.276644040 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs") pod "network-metrics-daemon-p54df" (UID: "81d750f7-8363-48b6-afd3-9847607883b7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:14.500257 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:14.500100 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02f069bd-5606-4bac-9784-8646fdf8c979-cert podName:02f069bd-5606-4bac-9784-8646fdf8c979 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:15.500090628 +0000 UTC m=+34.276667619 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/02f069bd-5606-4bac-9784-8646fdf8c979-cert") pod "ingress-canary-8q4g7" (UID: "02f069bd-5606-4bac-9784-8646fdf8c979") : secret "canary-serving-cert" not found
Apr 16 19:54:14.600896 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:14.600856 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xjrh\" (UniqueName: \"kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh\") pod \"network-check-target-chnql\" (UID: \"c7e55932-e28c-4952-86fc-0a2e235083be\") " pod="openshift-network-diagnostics/network-check-target-chnql"
Apr 16 19:54:14.603584 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:14.603559 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xjrh\" (UniqueName: \"kubernetes.io/projected/c7e55932-e28c-4952-86fc-0a2e235083be-kube-api-access-7xjrh\") pod \"network-check-target-chnql\" (UID: \"c7e55932-e28c-4952-86fc-0a2e235083be\") " pod="openshift-network-diagnostics/network-check-target-chnql"
Apr 16 19:54:14.708805 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:14.708729 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chnql"
Apr 16 19:54:14.794894 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:14.794864 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p54df"
Apr 16 19:54:14.798260 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:14.798233 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2xst7\""
Apr 16 19:54:14.798392 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:14.798347 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 19:54:15.405288 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:15.405249 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/133663ab-a7a5-4f8a-8659-5dcb18604eed-metrics-tls\") pod \"dns-default-dtqvs\" (UID: \"133663ab-a7a5-4f8a-8659-5dcb18604eed\") " pod="openshift-dns/dns-default-dtqvs"
Apr 16 19:54:15.405858 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:15.405354 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:15.405858 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:15.405389 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:54:15.405858 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:15.405477 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/133663ab-a7a5-4f8a-8659-5dcb18604eed-metrics-tls podName:133663ab-a7a5-4f8a-8659-5dcb18604eed nodeName:}" failed. No retries permitted until 2026-04-16 19:54:17.405456612 +0000 UTC m=+36.182033594 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/133663ab-a7a5-4f8a-8659-5dcb18604eed-metrics-tls") pod "dns-default-dtqvs" (UID: "133663ab-a7a5-4f8a-8659-5dcb18604eed") : secret "dns-default-metrics-tls" not found
Apr 16 19:54:15.405858 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:15.405484 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:54:15.405858 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:15.405499 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7cc4c694b8-2m66w: secret "image-registry-tls" not found
Apr 16 19:54:15.405858 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:15.405548 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls podName:7b325581-2334-4c35-ade9-6b22c9297769 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:17.40553238 +0000 UTC m=+36.182109372 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls") pod "image-registry-7cc4c694b8-2m66w" (UID: "7b325581-2334-4c35-ade9-6b22c9297769") : secret "image-registry-tls" not found
Apr 16 19:54:15.505702 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:15.505671 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02f069bd-5606-4bac-9784-8646fdf8c979-cert\") pod \"ingress-canary-8q4g7\" (UID: \"02f069bd-5606-4bac-9784-8646fdf8c979\") " pod="openshift-ingress-canary/ingress-canary-8q4g7"
Apr 16 19:54:15.505878 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:15.505814 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:54:15.505930 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:15.505887 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02f069bd-5606-4bac-9784-8646fdf8c979-cert podName:02f069bd-5606-4bac-9784-8646fdf8c979 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:17.505867772 +0000 UTC m=+36.282444786 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/02f069bd-5606-4bac-9784-8646fdf8c979-cert") pod "ingress-canary-8q4g7" (UID: "02f069bd-5606-4bac-9784-8646fdf8c979") : secret "canary-serving-cert" not found
Apr 16 19:54:16.132160 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:16.132130 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-chnql"]
Apr 16 19:54:16.238046 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:54:16.237978 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7e55932_e28c_4952_86fc_0a2e235083be.slice/crio-fa449e26e1eebd564ef7558c24914ff317b3a68dfa75ec8d9fc56f224a1e8cd7 WatchSource:0}: Error finding container fa449e26e1eebd564ef7558c24914ff317b3a68dfa75ec8d9fc56f224a1e8cd7: Status 404 returned error can't find the container with id fa449e26e1eebd564ef7558c24914ff317b3a68dfa75ec8d9fc56f224a1e8cd7
Apr 16 19:54:17.031671 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:17.031404 2563 generic.go:358] "Generic (PLEG): container finished" podID="fedbf08e-3ecd-47fe-bbea-4ca1def89a98" containerID="85d751bcceab4ef92d14c41f726a568cd922bc2fcb46dbee29658790192949c0" exitCode=0
Apr 16 19:54:17.032122 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:17.031472 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vqw9q" event={"ID":"fedbf08e-3ecd-47fe-bbea-4ca1def89a98","Type":"ContainerDied","Data":"85d751bcceab4ef92d14c41f726a568cd922bc2fcb46dbee29658790192949c0"}
Apr 16 19:54:17.033011 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:17.032986 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-chnql" event={"ID":"c7e55932-e28c-4952-86fc-0a2e235083be","Type":"ContainerStarted","Data":"fa449e26e1eebd564ef7558c24914ff317b3a68dfa75ec8d9fc56f224a1e8cd7"}
Apr 16 19:54:17.420402 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:17.420370 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/133663ab-a7a5-4f8a-8659-5dcb18604eed-metrics-tls\") pod \"dns-default-dtqvs\" (UID: \"133663ab-a7a5-4f8a-8659-5dcb18604eed\") " pod="openshift-dns/dns-default-dtqvs"
Apr 16 19:54:17.420553 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:17.420460 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:54:17.420553 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:17.420522 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:54:17.420553 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:17.420546 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:54:17.420673 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:17.420558 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7cc4c694b8-2m66w: secret "image-registry-tls" not found
Apr 16 19:54:17.420673 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:17.420592 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/133663ab-a7a5-4f8a-8659-5dcb18604eed-metrics-tls podName:133663ab-a7a5-4f8a-8659-5dcb18604eed nodeName:}" failed. No retries permitted until 2026-04-16 19:54:21.420574695 +0000 UTC m=+40.197151688 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/133663ab-a7a5-4f8a-8659-5dcb18604eed-metrics-tls") pod "dns-default-dtqvs" (UID: "133663ab-a7a5-4f8a-8659-5dcb18604eed") : secret "dns-default-metrics-tls" not found
Apr 16 19:54:17.420673 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:17.420630 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls podName:7b325581-2334-4c35-ade9-6b22c9297769 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:21.42062022 +0000 UTC m=+40.197197208 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls") pod "image-registry-7cc4c694b8-2m66w" (UID: "7b325581-2334-4c35-ade9-6b22c9297769") : secret "image-registry-tls" not found
Apr 16 19:54:17.521617 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:17.521577 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02f069bd-5606-4bac-9784-8646fdf8c979-cert\") pod \"ingress-canary-8q4g7\" (UID: \"02f069bd-5606-4bac-9784-8646fdf8c979\") " pod="openshift-ingress-canary/ingress-canary-8q4g7"
Apr 16 19:54:17.521768 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:17.521712 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:54:17.521811 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:17.521771 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02f069bd-5606-4bac-9784-8646fdf8c979-cert podName:02f069bd-5606-4bac-9784-8646fdf8c979 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:21.521756265 +0000 UTC m=+40.298333242 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/02f069bd-5606-4bac-9784-8646fdf8c979-cert") pod "ingress-canary-8q4g7" (UID: "02f069bd-5606-4bac-9784-8646fdf8c979") : secret "canary-serving-cert" not found
Apr 16 19:54:17.647214 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:17.647183 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6h88"]
Apr 16 19:54:17.669666 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:17.669638 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6h88"]
Apr 16 19:54:17.669833 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:17.669766 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6h88"
Apr 16 19:54:17.672894 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:17.672838 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 19:54:17.673267 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:17.673246 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 19:54:17.673427 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:17.673408 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-tt6dj\""
Apr 16 19:54:17.823852 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:17.823806 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7wnx\" (UniqueName: \"kubernetes.io/projected/1ad38f3c-cccf-4846-8c4c-864918ab774f-kube-api-access-w7wnx\") pod \"migrator-74bb7799d9-d6h88\" (UID: \"1ad38f3c-cccf-4846-8c4c-864918ab774f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6h88"
Apr 16 19:54:17.924278 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:17.924196 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7wnx\" (UniqueName: \"kubernetes.io/projected/1ad38f3c-cccf-4846-8c4c-864918ab774f-kube-api-access-w7wnx\") pod \"migrator-74bb7799d9-d6h88\" (UID: \"1ad38f3c-cccf-4846-8c4c-864918ab774f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6h88"
Apr 16 19:54:17.934493 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:17.934465 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7wnx\" (UniqueName: \"kubernetes.io/projected/1ad38f3c-cccf-4846-8c4c-864918ab774f-kube-api-access-w7wnx\") pod \"migrator-74bb7799d9-d6h88\" (UID: \"1ad38f3c-cccf-4846-8c4c-864918ab774f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6h88"
Apr 16 19:54:17.980457 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:17.980406 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6h88"
Apr 16 19:54:18.039139 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:18.039069 2563 generic.go:358] "Generic (PLEG): container finished" podID="fedbf08e-3ecd-47fe-bbea-4ca1def89a98" containerID="99b0381636941db6c459053d1d845b140f62f44840cfb44c4b4558384feee87a" exitCode=0
Apr 16 19:54:18.039139 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:18.039114 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vqw9q" event={"ID":"fedbf08e-3ecd-47fe-bbea-4ca1def89a98","Type":"ContainerDied","Data":"99b0381636941db6c459053d1d845b140f62f44840cfb44c4b4558384feee87a"}
Apr 16 19:54:18.111276 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:18.111244 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6h88"]
Apr 16 19:54:18.114411 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:54:18.114377 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ad38f3c_cccf_4846_8c4c_864918ab774f.slice/crio-14434dbd2e17176a6d89c30d8a063bee89069d86a3eb391996e7096bc29a6aff WatchSource:0}: Error finding container 14434dbd2e17176a6d89c30d8a063bee89069d86a3eb391996e7096bc29a6aff: Status 404 returned error can't find the container with id 14434dbd2e17176a6d89c30d8a063bee89069d86a3eb391996e7096bc29a6aff
Apr 16 19:54:19.044726 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:19.044463 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vqw9q" event={"ID":"fedbf08e-3ecd-47fe-bbea-4ca1def89a98","Type":"ContainerStarted","Data":"be5ad8f2cb8d885b46bc7196fef24c80f39289ffc4756f1b3ae3d21fbf3fd323"}
Apr 16 19:54:19.045568 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:19.045539 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6h88" event={"ID":"1ad38f3c-cccf-4846-8c4c-864918ab774f","Type":"ContainerStarted","Data":"14434dbd2e17176a6d89c30d8a063bee89069d86a3eb391996e7096bc29a6aff"}
Apr 16 19:54:19.076746 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:19.076690 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vqw9q" podStartSLOduration=4.846629634 podStartE2EDuration="38.076672971s" podCreationTimestamp="2026-04-16 19:53:41 +0000 UTC" firstStartedPulling="2026-04-16 19:53:43.037017726 +0000 UTC m=+1.813594703" lastFinishedPulling="2026-04-16 19:54:16.267061063 +0000 UTC m=+35.043638040" observedRunningTime="2026-04-16 19:54:19.07486972 +0000 UTC m=+37.851446717" watchObservedRunningTime="2026-04-16 19:54:19.076672971 +0000 UTC m=+37.853249961"
Apr 16 19:54:19.875659 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:19.875626 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9tsr8"]
Apr 16 19:54:19.900675 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:19.900645 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9tsr8"]
Apr 16 19:54:19.900790 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:19.900687 2563 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9tsr8"
Apr 16 19:54:19.903571 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:19.903542 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 19:54:19.903571 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:19.903569 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 19:54:19.903750 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:19.903589 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 19:54:19.903750 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:19.903542 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 19:54:19.903750 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:19.903569 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8tkc8\""
Apr 16 19:54:20.043450 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.043367 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0200234c-4441-4ee0-a6b1-e543a08da9b8-data-volume\") pod \"insights-runtime-extractor-9tsr8\" (UID: \"0200234c-4441-4ee0-a6b1-e543a08da9b8\") " pod="openshift-insights/insights-runtime-extractor-9tsr8"
Apr 16 19:54:20.043450 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.043436 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j5kt\" (UniqueName: \"kubernetes.io/projected/0200234c-4441-4ee0-a6b1-e543a08da9b8-kube-api-access-4j5kt\") pod \"insights-runtime-extractor-9tsr8\" (UID: \"0200234c-4441-4ee0-a6b1-e543a08da9b8\") " pod="openshift-insights/insights-runtime-extractor-9tsr8"
Apr 16 19:54:20.043711 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.043570 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0200234c-4441-4ee0-a6b1-e543a08da9b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9tsr8\" (UID: \"0200234c-4441-4ee0-a6b1-e543a08da9b8\") " pod="openshift-insights/insights-runtime-extractor-9tsr8"
Apr 16 19:54:20.043711 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.043677 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0200234c-4441-4ee0-a6b1-e543a08da9b8-crio-socket\") pod \"insights-runtime-extractor-9tsr8\" (UID: \"0200234c-4441-4ee0-a6b1-e543a08da9b8\") " pod="openshift-insights/insights-runtime-extractor-9tsr8"
Apr 16 19:54:20.043811 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.043771 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0200234c-4441-4ee0-a6b1-e543a08da9b8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9tsr8\" (UID: \"0200234c-4441-4ee0-a6b1-e543a08da9b8\") " pod="openshift-insights/insights-runtime-extractor-9tsr8"
Apr 16 19:54:20.049708 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.049673 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-chnql" event={"ID":"c7e55932-e28c-4952-86fc-0a2e235083be","Type":"ContainerStarted","Data":"ed6c0fc91fc0017ad91dcec66ba46d9d22e220e380cc92cb99f16f90900305e4"}
Apr 16 19:54:20.050121 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.050092 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-chnql"
Apr 16 19:54:20.069157 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.069110 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-chnql" podStartSLOduration=35.567843901 podStartE2EDuration="39.069097976s" podCreationTimestamp="2026-04-16 19:53:41 +0000 UTC" firstStartedPulling="2026-04-16 19:54:16.244902668 +0000 UTC m=+35.021479646" lastFinishedPulling="2026-04-16 19:54:19.746156738 +0000 UTC m=+38.522733721" observedRunningTime="2026-04-16 19:54:20.067824993 +0000 UTC m=+38.844401994" watchObservedRunningTime="2026-04-16 19:54:20.069097976 +0000 UTC m=+38.845675027"
Apr 16 19:54:20.144896 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.144863 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0200234c-4441-4ee0-a6b1-e543a08da9b8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9tsr8\" (UID: \"0200234c-4441-4ee0-a6b1-e543a08da9b8\") " pod="openshift-insights/insights-runtime-extractor-9tsr8"
Apr 16 19:54:20.145074 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.144960 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0200234c-4441-4ee0-a6b1-e543a08da9b8-data-volume\") pod \"insights-runtime-extractor-9tsr8\" (UID: \"0200234c-4441-4ee0-a6b1-e543a08da9b8\") " pod="openshift-insights/insights-runtime-extractor-9tsr8"
Apr 16 19:54:20.145074 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.144992 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4j5kt\" (UniqueName: \"kubernetes.io/projected/0200234c-4441-4ee0-a6b1-e543a08da9b8-kube-api-access-4j5kt\") pod \"insights-runtime-extractor-9tsr8\" (UID: \"0200234c-4441-4ee0-a6b1-e543a08da9b8\") " pod="openshift-insights/insights-runtime-extractor-9tsr8"
Apr 16 19:54:20.145074 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.145027 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0200234c-4441-4ee0-a6b1-e543a08da9b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9tsr8\" (UID: \"0200234c-4441-4ee0-a6b1-e543a08da9b8\") " pod="openshift-insights/insights-runtime-extractor-9tsr8"
Apr 16 19:54:20.145227 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.145114 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0200234c-4441-4ee0-a6b1-e543a08da9b8-crio-socket\") pod \"insights-runtime-extractor-9tsr8\" (UID: \"0200234c-4441-4ee0-a6b1-e543a08da9b8\") " pod="openshift-insights/insights-runtime-extractor-9tsr8"
Apr 16 19:54:20.145227 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:20.145143 2563 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 16 19:54:20.145227 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:20.145201 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0200234c-4441-4ee0-a6b1-e543a08da9b8-insights-runtime-extractor-tls podName:0200234c-4441-4ee0-a6b1-e543a08da9b8 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:20.645184339 +0000 UTC m=+39.421761327 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0200234c-4441-4ee0-a6b1-e543a08da9b8-insights-runtime-extractor-tls") pod "insights-runtime-extractor-9tsr8" (UID: "0200234c-4441-4ee0-a6b1-e543a08da9b8") : secret "insights-runtime-extractor-tls" not found
Apr 16 19:54:20.145359 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.145300 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0200234c-4441-4ee0-a6b1-e543a08da9b8-data-volume\") pod \"insights-runtime-extractor-9tsr8\" (UID: \"0200234c-4441-4ee0-a6b1-e543a08da9b8\") " pod="openshift-insights/insights-runtime-extractor-9tsr8"
Apr 16 19:54:20.145359 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.145330 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0200234c-4441-4ee0-a6b1-e543a08da9b8-crio-socket\") pod \"insights-runtime-extractor-9tsr8\" (UID: \"0200234c-4441-4ee0-a6b1-e543a08da9b8\") " pod="openshift-insights/insights-runtime-extractor-9tsr8"
Apr 16 19:54:20.155798 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.155777 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0200234c-4441-4ee0-a6b1-e543a08da9b8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9tsr8\" (UID: \"0200234c-4441-4ee0-a6b1-e543a08da9b8\") " pod="openshift-insights/insights-runtime-extractor-9tsr8"
Apr 16 19:54:20.157104 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.157085 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j5kt\" (UniqueName: \"kubernetes.io/projected/0200234c-4441-4ee0-a6b1-e543a08da9b8-kube-api-access-4j5kt\") pod \"insights-runtime-extractor-9tsr8\" (UID: \"0200234c-4441-4ee0-a6b1-e543a08da9b8\") " pod="openshift-insights/insights-runtime-extractor-9tsr8"
Apr 16 19:54:20.648617 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.648562 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0200234c-4441-4ee0-a6b1-e543a08da9b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9tsr8\" (UID: \"0200234c-4441-4ee0-a6b1-e543a08da9b8\") " pod="openshift-insights/insights-runtime-extractor-9tsr8"
Apr 16 19:54:20.648773 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:20.648698 2563 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 16 19:54:20.648773 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:20.648765 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0200234c-4441-4ee0-a6b1-e543a08da9b8-insights-runtime-extractor-tls podName:0200234c-4441-4ee0-a6b1-e543a08da9b8 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:21.648749287 +0000 UTC m=+40.425326264 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0200234c-4441-4ee0-a6b1-e543a08da9b8-insights-runtime-extractor-tls") pod "insights-runtime-extractor-9tsr8" (UID: "0200234c-4441-4ee0-a6b1-e543a08da9b8") : secret "insights-runtime-extractor-tls" not found
Apr 16 19:54:20.764657 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.764625 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-stfgk"]
Apr 16 19:54:20.789962 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.789936 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-stfgk"]
Apr 16 19:54:20.790075 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.789975 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-stfgk"
Apr 16 19:54:20.792787 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.792767 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 16 19:54:20.792787 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.792781 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 16 19:54:20.792952 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.792804 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 16 19:54:20.794200 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.794182 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 16 19:54:20.794301 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.794253 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-p75jn\""
Apr 16 19:54:20.950532 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.950509 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/99245d8b-f916-4b7b-907b-c110d66c41c9-signing-key\") pod \"service-ca-865cb79987-stfgk\" (UID: \"99245d8b-f916-4b7b-907b-c110d66c41c9\") " pod="openshift-service-ca/service-ca-865cb79987-stfgk"
Apr 16 19:54:20.950628 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.950541 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/99245d8b-f916-4b7b-907b-c110d66c41c9-signing-cabundle\") pod \"service-ca-865cb79987-stfgk\" (UID: \"99245d8b-f916-4b7b-907b-c110d66c41c9\") "
pod="openshift-service-ca/service-ca-865cb79987-stfgk" Apr 16 19:54:20.950628 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:20.950566 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cxx5\" (UniqueName: \"kubernetes.io/projected/99245d8b-f916-4b7b-907b-c110d66c41c9-kube-api-access-6cxx5\") pod \"service-ca-865cb79987-stfgk\" (UID: \"99245d8b-f916-4b7b-907b-c110d66c41c9\") " pod="openshift-service-ca/service-ca-865cb79987-stfgk" Apr 16 19:54:21.050900 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:21.050878 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/99245d8b-f916-4b7b-907b-c110d66c41c9-signing-key\") pod \"service-ca-865cb79987-stfgk\" (UID: \"99245d8b-f916-4b7b-907b-c110d66c41c9\") " pod="openshift-service-ca/service-ca-865cb79987-stfgk" Apr 16 19:54:21.051244 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:21.050904 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/99245d8b-f916-4b7b-907b-c110d66c41c9-signing-cabundle\") pod \"service-ca-865cb79987-stfgk\" (UID: \"99245d8b-f916-4b7b-907b-c110d66c41c9\") " pod="openshift-service-ca/service-ca-865cb79987-stfgk" Apr 16 19:54:21.051244 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:21.050921 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cxx5\" (UniqueName: \"kubernetes.io/projected/99245d8b-f916-4b7b-907b-c110d66c41c9-kube-api-access-6cxx5\") pod \"service-ca-865cb79987-stfgk\" (UID: \"99245d8b-f916-4b7b-907b-c110d66c41c9\") " pod="openshift-service-ca/service-ca-865cb79987-stfgk" Apr 16 19:54:21.051564 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:21.051545 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/99245d8b-f916-4b7b-907b-c110d66c41c9-signing-cabundle\") pod \"service-ca-865cb79987-stfgk\" (UID: \"99245d8b-f916-4b7b-907b-c110d66c41c9\") " pod="openshift-service-ca/service-ca-865cb79987-stfgk" Apr 16 19:54:21.052366 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:21.052341 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6h88" event={"ID":"1ad38f3c-cccf-4846-8c4c-864918ab774f","Type":"ContainerStarted","Data":"ab30dc0bef9115e7995b5f55de1dcea06c08e95ef0ecce00a74370b2bd6b7ddc"} Apr 16 19:54:21.053380 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:21.053361 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/99245d8b-f916-4b7b-907b-c110d66c41c9-signing-key\") pod \"service-ca-865cb79987-stfgk\" (UID: \"99245d8b-f916-4b7b-907b-c110d66c41c9\") " pod="openshift-service-ca/service-ca-865cb79987-stfgk" Apr 16 19:54:21.060726 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:21.060701 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cxx5\" (UniqueName: \"kubernetes.io/projected/99245d8b-f916-4b7b-907b-c110d66c41c9-kube-api-access-6cxx5\") pod \"service-ca-865cb79987-stfgk\" (UID: \"99245d8b-f916-4b7b-907b-c110d66c41c9\") " pod="openshift-service-ca/service-ca-865cb79987-stfgk" Apr 16 19:54:21.098765 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:21.098741 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-stfgk" Apr 16 19:54:21.219597 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:21.219572 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-stfgk"] Apr 16 19:54:21.232625 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:54:21.232591 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99245d8b_f916_4b7b_907b_c110d66c41c9.slice/crio-26ebfa0ebbe81f713a240b99e0b1e31b873d3bd6eeff01ec20ef62c327feaada WatchSource:0}: Error finding container 26ebfa0ebbe81f713a240b99e0b1e31b873d3bd6eeff01ec20ef62c327feaada: Status 404 returned error can't find the container with id 26ebfa0ebbe81f713a240b99e0b1e31b873d3bd6eeff01ec20ef62c327feaada Apr 16 19:54:21.455229 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:21.455133 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/133663ab-a7a5-4f8a-8659-5dcb18604eed-metrics-tls\") pod \"dns-default-dtqvs\" (UID: \"133663ab-a7a5-4f8a-8659-5dcb18604eed\") " pod="openshift-dns/dns-default-dtqvs" Apr 16 19:54:21.455388 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:21.455318 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w" Apr 16 19:54:21.455388 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:21.455318 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:21.455388 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:21.455374 2563 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:21.455388 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:21.455389 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7cc4c694b8-2m66w: secret "image-registry-tls" not found Apr 16 19:54:21.455588 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:21.455415 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/133663ab-a7a5-4f8a-8659-5dcb18604eed-metrics-tls podName:133663ab-a7a5-4f8a-8659-5dcb18604eed nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.455399694 +0000 UTC m=+48.231976684 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/133663ab-a7a5-4f8a-8659-5dcb18604eed-metrics-tls") pod "dns-default-dtqvs" (UID: "133663ab-a7a5-4f8a-8659-5dcb18604eed") : secret "dns-default-metrics-tls" not found Apr 16 19:54:21.455588 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:21.455440 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls podName:7b325581-2334-4c35-ade9-6b22c9297769 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.455420859 +0000 UTC m=+48.231997837 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls") pod "image-registry-7cc4c694b8-2m66w" (UID: "7b325581-2334-4c35-ade9-6b22c9297769") : secret "image-registry-tls" not found Apr 16 19:54:21.470499 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:21.470478 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6hmqh_fdcd8eab-4705-4045-bbc6-5974072ac6dd/dns-node-resolver/0.log" Apr 16 19:54:21.556547 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:21.556521 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02f069bd-5606-4bac-9784-8646fdf8c979-cert\") pod \"ingress-canary-8q4g7\" (UID: \"02f069bd-5606-4bac-9784-8646fdf8c979\") " pod="openshift-ingress-canary/ingress-canary-8q4g7" Apr 16 19:54:21.556702 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:21.556685 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:21.556753 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:21.556745 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02f069bd-5606-4bac-9784-8646fdf8c979-cert podName:02f069bd-5606-4bac-9784-8646fdf8c979 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.556729951 +0000 UTC m=+48.333306932 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/02f069bd-5606-4bac-9784-8646fdf8c979-cert") pod "ingress-canary-8q4g7" (UID: "02f069bd-5606-4bac-9784-8646fdf8c979") : secret "canary-serving-cert" not found Apr 16 19:54:21.657965 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:21.657926 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0200234c-4441-4ee0-a6b1-e543a08da9b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9tsr8\" (UID: \"0200234c-4441-4ee0-a6b1-e543a08da9b8\") " pod="openshift-insights/insights-runtime-extractor-9tsr8" Apr 16 19:54:21.658140 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:21.658119 2563 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 19:54:21.658215 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:21.658202 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0200234c-4441-4ee0-a6b1-e543a08da9b8-insights-runtime-extractor-tls podName:0200234c-4441-4ee0-a6b1-e543a08da9b8 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:23.65818007 +0000 UTC m=+42.434757068 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0200234c-4441-4ee0-a6b1-e543a08da9b8-insights-runtime-extractor-tls") pod "insights-runtime-extractor-9tsr8" (UID: "0200234c-4441-4ee0-a6b1-e543a08da9b8") : secret "insights-runtime-extractor-tls" not found Apr 16 19:54:22.057225 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:22.057183 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6h88" event={"ID":"1ad38f3c-cccf-4846-8c4c-864918ab774f","Type":"ContainerStarted","Data":"d2c9579c571e33470d0f192c675be3e45201fa14a606505d02edfa22d579114f"} Apr 16 19:54:22.058333 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:22.058295 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-stfgk" event={"ID":"99245d8b-f916-4b7b-907b-c110d66c41c9","Type":"ContainerStarted","Data":"26ebfa0ebbe81f713a240b99e0b1e31b873d3bd6eeff01ec20ef62c327feaada"} Apr 16 19:54:22.083032 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:22.082984 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6h88" podStartSLOduration=2.317340727 podStartE2EDuration="5.082966919s" podCreationTimestamp="2026-04-16 19:54:17 +0000 UTC" firstStartedPulling="2026-04-16 19:54:18.116760996 +0000 UTC m=+36.893337977" lastFinishedPulling="2026-04-16 19:54:20.882387178 +0000 UTC m=+39.658964169" observedRunningTime="2026-04-16 19:54:22.081718447 +0000 UTC m=+40.858295447" watchObservedRunningTime="2026-04-16 19:54:22.082966919 +0000 UTC m=+40.859543920" Apr 16 19:54:22.671800 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:22.671772 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nm7vd_6eefa0ff-7de4-4c45-af84-a83e70151ad6/node-ca/0.log" Apr 16 19:54:23.472918 ip-10-0-135-244 kubenswrapper[2563]: I0416 
19:54:23.472890 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-d6h88_1ad38f3c-cccf-4846-8c4c-864918ab774f/migrator/0.log" Apr 16 19:54:23.670340 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:23.670308 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-d6h88_1ad38f3c-cccf-4846-8c4c-864918ab774f/graceful-termination/0.log" Apr 16 19:54:23.670481 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:23.670439 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0200234c-4441-4ee0-a6b1-e543a08da9b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9tsr8\" (UID: \"0200234c-4441-4ee0-a6b1-e543a08da9b8\") " pod="openshift-insights/insights-runtime-extractor-9tsr8" Apr 16 19:54:23.670663 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:23.670644 2563 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 19:54:23.670730 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:23.670718 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0200234c-4441-4ee0-a6b1-e543a08da9b8-insights-runtime-extractor-tls podName:0200234c-4441-4ee0-a6b1-e543a08da9b8 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:27.670701092 +0000 UTC m=+46.447278078 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0200234c-4441-4ee0-a6b1-e543a08da9b8-insights-runtime-extractor-tls") pod "insights-runtime-extractor-9tsr8" (UID: "0200234c-4441-4ee0-a6b1-e543a08da9b8") : secret "insights-runtime-extractor-tls" not found Apr 16 19:54:24.063486 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:24.063405 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-stfgk" event={"ID":"99245d8b-f916-4b7b-907b-c110d66c41c9","Type":"ContainerStarted","Data":"47ef9c19e95a48435b0b82d3fa19cdfa9c9e9c364287749d384098445949ec18"} Apr 16 19:54:24.079588 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:24.079544 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-stfgk" podStartSLOduration=1.97968785 podStartE2EDuration="4.079530702s" podCreationTimestamp="2026-04-16 19:54:20 +0000 UTC" firstStartedPulling="2026-04-16 19:54:21.234451537 +0000 UTC m=+40.011028518" lastFinishedPulling="2026-04-16 19:54:23.334294394 +0000 UTC m=+42.110871370" observedRunningTime="2026-04-16 19:54:24.078323384 +0000 UTC m=+42.854900383" watchObservedRunningTime="2026-04-16 19:54:24.079530702 +0000 UTC m=+42.856107701" Apr 16 19:54:27.701330 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:27.701295 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0200234c-4441-4ee0-a6b1-e543a08da9b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9tsr8\" (UID: \"0200234c-4441-4ee0-a6b1-e543a08da9b8\") " pod="openshift-insights/insights-runtime-extractor-9tsr8" Apr 16 19:54:27.701774 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:27.701409 2563 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 
19:54:27.701774 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:54:27.701471 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0200234c-4441-4ee0-a6b1-e543a08da9b8-insights-runtime-extractor-tls podName:0200234c-4441-4ee0-a6b1-e543a08da9b8 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:35.70145681 +0000 UTC m=+54.478033787 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0200234c-4441-4ee0-a6b1-e543a08da9b8-insights-runtime-extractor-tls") pod "insights-runtime-extractor-9tsr8" (UID: "0200234c-4441-4ee0-a6b1-e543a08da9b8") : secret "insights-runtime-extractor-tls" not found Apr 16 19:54:29.214073 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:29.214035 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret\") pod \"global-pull-secret-syncer-vpg7x\" (UID: \"e922f21a-2e9d-4d74-9bbf-9f154ed71518\") " pod="kube-system/global-pull-secret-syncer-vpg7x" Apr 16 19:54:29.216383 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:29.216348 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e922f21a-2e9d-4d74-9bbf-9f154ed71518-original-pull-secret\") pod \"global-pull-secret-syncer-vpg7x\" (UID: \"e922f21a-2e9d-4d74-9bbf-9f154ed71518\") " pod="kube-system/global-pull-secret-syncer-vpg7x" Apr 16 19:54:29.403612 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:29.403568 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vpg7x" Apr 16 19:54:29.516320 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:29.516292 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w" Apr 16 19:54:29.516444 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:29.516346 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/133663ab-a7a5-4f8a-8659-5dcb18604eed-metrics-tls\") pod \"dns-default-dtqvs\" (UID: \"133663ab-a7a5-4f8a-8659-5dcb18604eed\") " pod="openshift-dns/dns-default-dtqvs" Apr 16 19:54:29.518466 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:29.518441 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls\") pod \"image-registry-7cc4c694b8-2m66w\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") " pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w" Apr 16 19:54:29.527722 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:29.527699 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/133663ab-a7a5-4f8a-8659-5dcb18604eed-metrics-tls\") pod \"dns-default-dtqvs\" (UID: \"133663ab-a7a5-4f8a-8659-5dcb18604eed\") " pod="openshift-dns/dns-default-dtqvs" Apr 16 19:54:29.543379 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:29.543357 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vpg7x"] Apr 16 19:54:29.546751 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:54:29.546725 2563 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode922f21a_2e9d_4d74_9bbf_9f154ed71518.slice/crio-465b37c903a635f85161ab5b04c00f5df21099643c9250576809a6e428aa583c WatchSource:0}: Error finding container 465b37c903a635f85161ab5b04c00f5df21099643c9250576809a6e428aa583c: Status 404 returned error can't find the container with id 465b37c903a635f85161ab5b04c00f5df21099643c9250576809a6e428aa583c Apr 16 19:54:29.563486 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:29.563466 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w" Apr 16 19:54:29.617583 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:29.617557 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02f069bd-5606-4bac-9784-8646fdf8c979-cert\") pod \"ingress-canary-8q4g7\" (UID: \"02f069bd-5606-4bac-9784-8646fdf8c979\") " pod="openshift-ingress-canary/ingress-canary-8q4g7" Apr 16 19:54:29.618352 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:29.618332 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dtqvs" Apr 16 19:54:29.620492 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:29.620443 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02f069bd-5606-4bac-9784-8646fdf8c979-cert\") pod \"ingress-canary-8q4g7\" (UID: \"02f069bd-5606-4bac-9784-8646fdf8c979\") " pod="openshift-ingress-canary/ingress-canary-8q4g7" Apr 16 19:54:29.638505 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:29.638471 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8q4g7" Apr 16 19:54:29.692955 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:29.692927 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7cc4c694b8-2m66w"] Apr 16 19:54:29.695444 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:54:29.695414 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b325581_2334_4c35_ade9_6b22c9297769.slice/crio-8c85b8e10db10ede70e17f7002b6c8980ac18c9cf03bd0d6fee53213ae353a26 WatchSource:0}: Error finding container 8c85b8e10db10ede70e17f7002b6c8980ac18c9cf03bd0d6fee53213ae353a26: Status 404 returned error can't find the container with id 8c85b8e10db10ede70e17f7002b6c8980ac18c9cf03bd0d6fee53213ae353a26 Apr 16 19:54:29.754692 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:29.754664 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dtqvs"] Apr 16 19:54:29.767642 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:54:29.767592 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod133663ab_a7a5_4f8a_8659_5dcb18604eed.slice/crio-67706e04ebbbcfaf13523a7590431b96cce702ef5ab3d1badc4bfd207fda76bc WatchSource:0}: Error finding container 67706e04ebbbcfaf13523a7590431b96cce702ef5ab3d1badc4bfd207fda76bc: Status 404 returned error can't find the container with id 67706e04ebbbcfaf13523a7590431b96cce702ef5ab3d1badc4bfd207fda76bc Apr 16 19:54:29.775466 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:29.775441 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8q4g7"] Apr 16 19:54:29.777724 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:54:29.777692 2563 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02f069bd_5606_4bac_9784_8646fdf8c979.slice/crio-c033ca46b0089a7919cc9d33829ca4468fec1cecc4cdfb8c2ab26b410899cd47 WatchSource:0}: Error finding container c033ca46b0089a7919cc9d33829ca4468fec1cecc4cdfb8c2ab26b410899cd47: Status 404 returned error can't find the container with id c033ca46b0089a7919cc9d33829ca4468fec1cecc4cdfb8c2ab26b410899cd47 Apr 16 19:54:30.079429 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:30.079339 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8q4g7" event={"ID":"02f069bd-5606-4bac-9784-8646fdf8c979","Type":"ContainerStarted","Data":"c033ca46b0089a7919cc9d33829ca4468fec1cecc4cdfb8c2ab26b410899cd47"} Apr 16 19:54:30.080662 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:30.080625 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dtqvs" event={"ID":"133663ab-a7a5-4f8a-8659-5dcb18604eed","Type":"ContainerStarted","Data":"67706e04ebbbcfaf13523a7590431b96cce702ef5ab3d1badc4bfd207fda76bc"} Apr 16 19:54:30.081731 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:30.081700 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vpg7x" event={"ID":"e922f21a-2e9d-4d74-9bbf-9f154ed71518","Type":"ContainerStarted","Data":"465b37c903a635f85161ab5b04c00f5df21099643c9250576809a6e428aa583c"} Apr 16 19:54:30.083115 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:30.083067 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w" event={"ID":"7b325581-2334-4c35-ade9-6b22c9297769","Type":"ContainerStarted","Data":"7e52126742f49e88508b2f16e4a13c936313741c789daf0830001278c12ccfd8"} Apr 16 19:54:30.083115 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:30.083098 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w" 
event={"ID":"7b325581-2334-4c35-ade9-6b22c9297769","Type":"ContainerStarted","Data":"8c85b8e10db10ede70e17f7002b6c8980ac18c9cf03bd0d6fee53213ae353a26"} Apr 16 19:54:30.083279 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:30.083202 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w" Apr 16 19:54:30.116187 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:30.116113 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w" podStartSLOduration=31.116094655 podStartE2EDuration="31.116094655s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:30.114638349 +0000 UTC m=+48.891215348" watchObservedRunningTime="2026-04-16 19:54:30.116094655 +0000 UTC m=+48.892671646" Apr 16 19:54:34.099021 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:34.098984 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8q4g7" event={"ID":"02f069bd-5606-4bac-9784-8646fdf8c979","Type":"ContainerStarted","Data":"e07a9ac90bce32c2e91a272f06cbfd3d38e0608661d01b1ef5f2fc9876395a16"} Apr 16 19:54:34.101033 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:34.101005 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dtqvs" event={"ID":"133663ab-a7a5-4f8a-8659-5dcb18604eed","Type":"ContainerStarted","Data":"fda778d51eb85124bee94eed51f4cc5c8804fe7c70770d859ac9ea10300788bf"} Apr 16 19:54:34.102517 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:34.102495 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vpg7x" event={"ID":"e922f21a-2e9d-4d74-9bbf-9f154ed71518","Type":"ContainerStarted","Data":"1f24d8f6095154d1cac618762232eb67a246d9217b328be4c341343fdeebee1f"} Apr 16 
19:54:34.119387 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:34.119339 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8q4g7" podStartSLOduration=16.975604656 podStartE2EDuration="21.119323041s" podCreationTimestamp="2026-04-16 19:54:13 +0000 UTC" firstStartedPulling="2026-04-16 19:54:29.779536331 +0000 UTC m=+48.556113308" lastFinishedPulling="2026-04-16 19:54:33.923254712 +0000 UTC m=+52.699831693" observedRunningTime="2026-04-16 19:54:34.118668622 +0000 UTC m=+52.895245624" watchObservedRunningTime="2026-04-16 19:54:34.119323041 +0000 UTC m=+52.895900044"
Apr 16 19:54:35.107239 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:35.107202 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dtqvs" event={"ID":"133663ab-a7a5-4f8a-8659-5dcb18604eed","Type":"ContainerStarted","Data":"a5b24fb392dffc3613b91312fa64b20d4aff43bdd1d642afbcd07e2ee84b87b3"}
Apr 16 19:54:35.107710 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:35.107517 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-dtqvs"
Apr 16 19:54:35.124543 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:35.124501 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-vpg7x" podStartSLOduration=33.739558503 podStartE2EDuration="38.124488879s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:54:29.54829964 +0000 UTC m=+48.324876617" lastFinishedPulling="2026-04-16 19:54:33.933230013 +0000 UTC m=+52.709806993" observedRunningTime="2026-04-16 19:54:34.147870844 +0000 UTC m=+52.924447843" watchObservedRunningTime="2026-04-16 19:54:35.124488879 +0000 UTC m=+53.901065877"
Apr 16 19:54:35.124879 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:35.124848 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dtqvs" podStartSLOduration=17.972975085 podStartE2EDuration="22.124840031s" podCreationTimestamp="2026-04-16 19:54:13 +0000 UTC" firstStartedPulling="2026-04-16 19:54:29.769457905 +0000 UTC m=+48.546034882" lastFinishedPulling="2026-04-16 19:54:33.921322849 +0000 UTC m=+52.697899828" observedRunningTime="2026-04-16 19:54:35.124001941 +0000 UTC m=+53.900578940" watchObservedRunningTime="2026-04-16 19:54:35.124840031 +0000 UTC m=+53.901417029"
Apr 16 19:54:35.769993 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:35.769956 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0200234c-4441-4ee0-a6b1-e543a08da9b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9tsr8\" (UID: \"0200234c-4441-4ee0-a6b1-e543a08da9b8\") " pod="openshift-insights/insights-runtime-extractor-9tsr8"
Apr 16 19:54:35.772249 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:35.772218 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0200234c-4441-4ee0-a6b1-e543a08da9b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9tsr8\" (UID: \"0200234c-4441-4ee0-a6b1-e543a08da9b8\") " pod="openshift-insights/insights-runtime-extractor-9tsr8"
Apr 16 19:54:35.812060 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:35.812039 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9tsr8"
Apr 16 19:54:35.929638 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:35.929592 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9tsr8"]
Apr 16 19:54:35.935082 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:54:35.935052 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0200234c_4441_4ee0_a6b1_e543a08da9b8.slice/crio-f3092fce8d45c440f448408c7fb93303ffb7107d5df44541d885bdda5eb703c3 WatchSource:0}: Error finding container f3092fce8d45c440f448408c7fb93303ffb7107d5df44541d885bdda5eb703c3: Status 404 returned error can't find the container with id f3092fce8d45c440f448408c7fb93303ffb7107d5df44541d885bdda5eb703c3
Apr 16 19:54:36.111921 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:36.111839 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9tsr8" event={"ID":"0200234c-4441-4ee0-a6b1-e543a08da9b8","Type":"ContainerStarted","Data":"2407a4146375c88b974a74e6dffdcdae5210d08bb7983a64849d2674b8aa89ce"}
Apr 16 19:54:36.111921 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:36.111875 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9tsr8" event={"ID":"0200234c-4441-4ee0-a6b1-e543a08da9b8","Type":"ContainerStarted","Data":"f3092fce8d45c440f448408c7fb93303ffb7107d5df44541d885bdda5eb703c3"}
Apr 16 19:54:37.116761 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:37.116732 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9tsr8" event={"ID":"0200234c-4441-4ee0-a6b1-e543a08da9b8","Type":"ContainerStarted","Data":"7006fc936ca568a885255228529f193f90ff347d676d2c21d9c5d531ff86c671"}
Apr 16 19:54:39.027898 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:39.027874 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8s7w4"
Apr 16 19:54:39.123530 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:39.123497 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9tsr8" event={"ID":"0200234c-4441-4ee0-a6b1-e543a08da9b8","Type":"ContainerStarted","Data":"f53c35f3c7c676711229275ea9d8b384d7a4e98bbf83d53db1b30d5ac9737b80"}
Apr 16 19:54:39.144055 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:39.144014 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9tsr8" podStartSLOduration=17.173030808 podStartE2EDuration="20.144000011s" podCreationTimestamp="2026-04-16 19:54:19 +0000 UTC" firstStartedPulling="2026-04-16 19:54:35.995937503 +0000 UTC m=+54.772514480" lastFinishedPulling="2026-04-16 19:54:38.966906706 +0000 UTC m=+57.743483683" observedRunningTime="2026-04-16 19:54:39.142950653 +0000 UTC m=+57.919527652" watchObservedRunningTime="2026-04-16 19:54:39.144000011 +0000 UTC m=+57.920577055"
Apr 16 19:54:45.114488 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.114456 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dtqvs"
Apr 16 19:54:45.242716 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.242683 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d2rzb"]
Apr 16 19:54:45.248158 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.248136 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-vjnr7"]
Apr 16 19:54:45.248305 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.248287 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d2rzb"
Apr 16 19:54:45.251659 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.251640 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-vjnr7"
Apr 16 19:54:45.252300 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.252275 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-82vbg\""
Apr 16 19:54:45.253757 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.253738 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 19:54:45.257836 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.257818 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 19:54:45.262513 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.262494 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-fprmv\""
Apr 16 19:54:45.266420 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.266402 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 19:54:45.271403 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.271384 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-vjnr7"]
Apr 16 19:54:45.312028 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.312001 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7cc4c694b8-2m66w"]
Apr 16 19:54:45.316148 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.316126 2563 patch_prober.go:28] interesting pod/image-registry-7cc4c694b8-2m66w container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 19:54:45.316238 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.316164 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w" podUID="7b325581-2334-4c35-ade9-6b22c9297769" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 19:54:45.334975 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.334932 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d2rzb"]
Apr 16 19:54:45.335500 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.335479 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcgjd\" (UniqueName: \"kubernetes.io/projected/9f502296-7bae-46a2-93aa-fb3effb2035b-kube-api-access-wcgjd\") pod \"downloads-6bcc868b7-vjnr7\" (UID: \"9f502296-7bae-46a2-93aa-fb3effb2035b\") " pod="openshift-console/downloads-6bcc868b7-vjnr7"
Apr 16 19:54:45.335567 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.335514 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/abe80e33-639d-48ee-a5c7-d2a276c94434-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-d2rzb\" (UID: \"abe80e33-639d-48ee-a5c7-d2a276c94434\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d2rzb"
Apr 16 19:54:45.376822 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.376796 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5b8bff98b5-kb9d6"]
Apr 16 19:54:45.384182 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.384163 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.409168 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.409148 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5b8bff98b5-kb9d6"]
Apr 16 19:54:45.435828 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.435809 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a79633d3-d516-48b7-b2f8-86c85d66903a-registry-certificates\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.435922 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.435836 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a79633d3-d516-48b7-b2f8-86c85d66903a-registry-tls\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.435922 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.435869 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcgjd\" (UniqueName: \"kubernetes.io/projected/9f502296-7bae-46a2-93aa-fb3effb2035b-kube-api-access-wcgjd\") pod \"downloads-6bcc868b7-vjnr7\" (UID: \"9f502296-7bae-46a2-93aa-fb3effb2035b\") " pod="openshift-console/downloads-6bcc868b7-vjnr7"
Apr 16 19:54:45.435922 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.435913 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/abe80e33-639d-48ee-a5c7-d2a276c94434-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-d2rzb\" (UID: \"abe80e33-639d-48ee-a5c7-d2a276c94434\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d2rzb"
Apr 16 19:54:45.436023 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.435962 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a79633d3-d516-48b7-b2f8-86c85d66903a-bound-sa-token\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.436023 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.435998 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a79633d3-d516-48b7-b2f8-86c85d66903a-installation-pull-secrets\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.436098 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.436029 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a79633d3-d516-48b7-b2f8-86c85d66903a-image-registry-private-configuration\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.436098 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.436054 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a79633d3-d516-48b7-b2f8-86c85d66903a-ca-trust-extracted\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.436098 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.436083 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a79633d3-d516-48b7-b2f8-86c85d66903a-trusted-ca\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.436189 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.436118 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjrf4\" (UniqueName: \"kubernetes.io/projected/a79633d3-d516-48b7-b2f8-86c85d66903a-kube-api-access-jjrf4\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.438282 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.438261 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/abe80e33-639d-48ee-a5c7-d2a276c94434-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-d2rzb\" (UID: \"abe80e33-639d-48ee-a5c7-d2a276c94434\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d2rzb"
Apr 16 19:54:45.463374 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.463351 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcgjd\" (UniqueName: \"kubernetes.io/projected/9f502296-7bae-46a2-93aa-fb3effb2035b-kube-api-access-wcgjd\") pod \"downloads-6bcc868b7-vjnr7\" (UID: \"9f502296-7bae-46a2-93aa-fb3effb2035b\") " pod="openshift-console/downloads-6bcc868b7-vjnr7"
Apr 16 19:54:45.537206 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.537185 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjrf4\" (UniqueName: \"kubernetes.io/projected/a79633d3-d516-48b7-b2f8-86c85d66903a-kube-api-access-jjrf4\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.537297 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.537219 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a79633d3-d516-48b7-b2f8-86c85d66903a-registry-certificates\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.537297 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.537238 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a79633d3-d516-48b7-b2f8-86c85d66903a-registry-tls\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.537297 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.537283 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a79633d3-d516-48b7-b2f8-86c85d66903a-bound-sa-token\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.537424 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.537311 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a79633d3-d516-48b7-b2f8-86c85d66903a-installation-pull-secrets\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.537424 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.537343 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a79633d3-d516-48b7-b2f8-86c85d66903a-image-registry-private-configuration\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.537424 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.537367 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a79633d3-d516-48b7-b2f8-86c85d66903a-ca-trust-extracted\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.537572 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.537396 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a79633d3-d516-48b7-b2f8-86c85d66903a-trusted-ca\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.537836 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.537802 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a79633d3-d516-48b7-b2f8-86c85d66903a-ca-trust-extracted\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.538149 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.538117 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a79633d3-d516-48b7-b2f8-86c85d66903a-registry-certificates\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.538298 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.538278 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a79633d3-d516-48b7-b2f8-86c85d66903a-trusted-ca\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.539741 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.539714 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a79633d3-d516-48b7-b2f8-86c85d66903a-installation-pull-secrets\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.539825 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.539754 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a79633d3-d516-48b7-b2f8-86c85d66903a-image-registry-private-configuration\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.539949 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.539932 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a79633d3-d516-48b7-b2f8-86c85d66903a-registry-tls\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.551359 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.551339 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a79633d3-d516-48b7-b2f8-86c85d66903a-bound-sa-token\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.552033 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.552010 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjrf4\" (UniqueName: \"kubernetes.io/projected/a79633d3-d516-48b7-b2f8-86c85d66903a-kube-api-access-jjrf4\") pod \"image-registry-5b8bff98b5-kb9d6\" (UID: \"a79633d3-d516-48b7-b2f8-86c85d66903a\") " pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.559676 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.559657 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d2rzb"
Apr 16 19:54:45.564354 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.564333 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-vjnr7"
Apr 16 19:54:45.684365 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.684342 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d2rzb"]
Apr 16 19:54:45.686284 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:54:45.686259 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabe80e33_639d_48ee_a5c7_d2a276c94434.slice/crio-27bdd9ad94012ffbfeb3db1d85813b6f204c92c544fbcc35b066d7636a20d0cb WatchSource:0}: Error finding container 27bdd9ad94012ffbfeb3db1d85813b6f204c92c544fbcc35b066d7636a20d0cb: Status 404 returned error can't find the container with id 27bdd9ad94012ffbfeb3db1d85813b6f204c92c544fbcc35b066d7636a20d0cb
Apr 16 19:54:45.692734 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.692712 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:45.703254 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.703235 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-vjnr7"]
Apr 16 19:54:45.705305 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:54:45.705277 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f502296_7bae_46a2_93aa_fb3effb2035b.slice/crio-a8b598178f75cdefa3fbb44f647b91881711c68f90f9001b7e351117ec310a30 WatchSource:0}: Error finding container a8b598178f75cdefa3fbb44f647b91881711c68f90f9001b7e351117ec310a30: Status 404 returned error can't find the container with id a8b598178f75cdefa3fbb44f647b91881711c68f90f9001b7e351117ec310a30
Apr 16 19:54:45.817938 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:45.817913 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5b8bff98b5-kb9d6"]
Apr 16 19:54:45.822548 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:54:45.822525 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda79633d3_d516_48b7_b2f8_86c85d66903a.slice/crio-d5f93cad3cfe7948e58f3b041e4d13aba25b1af1736f108c059ad3eaa290f2db WatchSource:0}: Error finding container d5f93cad3cfe7948e58f3b041e4d13aba25b1af1736f108c059ad3eaa290f2db: Status 404 returned error can't find the container with id d5f93cad3cfe7948e58f3b041e4d13aba25b1af1736f108c059ad3eaa290f2db
Apr 16 19:54:46.142122 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:46.142040 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6" event={"ID":"a79633d3-d516-48b7-b2f8-86c85d66903a","Type":"ContainerStarted","Data":"e8cd47c434cc93884fbb59408763a7b8a8deb799b44c891f697891edff50c575"}
Apr 16 19:54:46.142122 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:46.142083 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6" event={"ID":"a79633d3-d516-48b7-b2f8-86c85d66903a","Type":"ContainerStarted","Data":"d5f93cad3cfe7948e58f3b041e4d13aba25b1af1736f108c059ad3eaa290f2db"}
Apr 16 19:54:46.142643 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:46.142175 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:54:46.143294 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:46.143265 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-vjnr7" event={"ID":"9f502296-7bae-46a2-93aa-fb3effb2035b","Type":"ContainerStarted","Data":"a8b598178f75cdefa3fbb44f647b91881711c68f90f9001b7e351117ec310a30"}
Apr 16 19:54:46.144278 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:46.144260 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d2rzb" event={"ID":"abe80e33-639d-48ee-a5c7-d2a276c94434","Type":"ContainerStarted","Data":"27bdd9ad94012ffbfeb3db1d85813b6f204c92c544fbcc35b066d7636a20d0cb"}
Apr 16 19:54:46.166079 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:46.166030 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6" podStartSLOduration=1.166013894 podStartE2EDuration="1.166013894s" podCreationTimestamp="2026-04-16 19:54:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:46.164774954 +0000 UTC m=+64.941351950" watchObservedRunningTime="2026-04-16 19:54:46.166013894 +0000 UTC m=+64.942590894"
Apr 16 19:54:46.546713 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:46.546629 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs\") pod \"network-metrics-daemon-p54df\" (UID: \"81d750f7-8363-48b6-afd3-9847607883b7\") " pod="openshift-multus/network-metrics-daemon-p54df"
Apr 16 19:54:46.549336 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:46.549298 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 19:54:46.560318 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:46.560291 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81d750f7-8363-48b6-afd3-9847607883b7-metrics-certs\") pod \"network-metrics-daemon-p54df\" (UID: \"81d750f7-8363-48b6-afd3-9847607883b7\") " pod="openshift-multus/network-metrics-daemon-p54df"
Apr 16 19:54:46.607616 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:46.607572 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2xst7\""
Apr 16 19:54:46.616080 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:46.616057 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p54df"
Apr 16 19:54:46.753461 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:46.753424 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-p54df"]
Apr 16 19:54:46.756849 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:54:46.756820 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81d750f7_8363_48b6_afd3_9847607883b7.slice/crio-c7ec386c20149659e1473fdd499defe05153d1ee3cd96a3462b6bee11604ce69 WatchSource:0}: Error finding container c7ec386c20149659e1473fdd499defe05153d1ee3cd96a3462b6bee11604ce69: Status 404 returned error can't find the container with id c7ec386c20149659e1473fdd499defe05153d1ee3cd96a3462b6bee11604ce69
Apr 16 19:54:47.147966 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:47.147928 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p54df" event={"ID":"81d750f7-8363-48b6-afd3-9847607883b7","Type":"ContainerStarted","Data":"c7ec386c20149659e1473fdd499defe05153d1ee3cd96a3462b6bee11604ce69"}
Apr 16 19:54:48.155821 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:48.155755 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d2rzb" event={"ID":"abe80e33-639d-48ee-a5c7-d2a276c94434","Type":"ContainerStarted","Data":"60f48a26bff458e64f9f82c7645028de9ca3b87c351673c3b3d35ab065210535"}
Apr 16 19:54:48.156284 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:48.156159 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d2rzb"
Apr 16 19:54:48.161872 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:48.161848 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d2rzb"
Apr 16 19:54:48.175208 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:48.175152 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-d2rzb" podStartSLOduration=1.345385874 podStartE2EDuration="3.175135925s" podCreationTimestamp="2026-04-16 19:54:45 +0000 UTC" firstStartedPulling="2026-04-16 19:54:45.688226599 +0000 UTC m=+64.464803575" lastFinishedPulling="2026-04-16 19:54:47.517976631 +0000 UTC m=+66.294553626" observedRunningTime="2026-04-16 19:54:48.173808807 +0000 UTC m=+66.950385806" watchObservedRunningTime="2026-04-16 19:54:48.175135925 +0000 UTC m=+66.951712924"
Apr 16 19:54:49.164885 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:49.164752 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p54df" event={"ID":"81d750f7-8363-48b6-afd3-9847607883b7","Type":"ContainerStarted","Data":"8564dddee25426871416e4e26b479e694b3b725ea1774415f44809d699df8b87"}
Apr 16 19:54:49.164885 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:49.164812 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p54df" event={"ID":"81d750f7-8363-48b6-afd3-9847607883b7","Type":"ContainerStarted","Data":"bea95e99ae32c7d6fbe0939a6c62444f9b7b3056460cef3213a0f9e81b2cc716"}
Apr 16 19:54:49.198650 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:49.198571 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-p54df" podStartSLOduration=66.375671221 podStartE2EDuration="1m8.198550407s" podCreationTimestamp="2026-04-16 19:53:41 +0000 UTC" firstStartedPulling="2026-04-16 19:54:46.759473681 +0000 UTC m=+65.536050658" lastFinishedPulling="2026-04-16 19:54:48.582352863 +0000 UTC m=+67.358929844" observedRunningTime="2026-04-16 19:54:49.196967804 +0000 UTC m=+67.973544817" watchObservedRunningTime="2026-04-16 19:54:49.198550407 +0000 UTC m=+67.975127406"
Apr 16 19:54:52.062051 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:52.062013 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-chnql"
Apr 16 19:54:54.733672 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.733636 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-5wkvv"]
Apr 16 19:54:54.738926 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.738903 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-5wkvv"
Apr 16 19:54:54.742485 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.742321 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 19:54:54.742485 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.742339 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 19:54:54.742675 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.742665 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-rv4cw\""
Apr 16 19:54:54.742983 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.742960 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 19:54:54.743099 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.743021 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 19:54:54.743099 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.742968 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 19:54:54.746674 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.746650 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 19:54:54.807374 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.807350 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/519fe37e-7387-44fb-bec2-9430b0c20e29-node-exporter-tls\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv"
Apr 16 19:54:54.807500 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.807405 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/519fe37e-7387-44fb-bec2-9430b0c20e29-metrics-client-ca\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv"
Apr 16 19:54:54.807500 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.807463 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/519fe37e-7387-44fb-bec2-9430b0c20e29-node-exporter-accelerators-collector-config\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv"
Apr 16 19:54:54.807632 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.807514 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for
volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/519fe37e-7387-44fb-bec2-9430b0c20e29-node-exporter-textfile\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.807632 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.807543 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/519fe37e-7387-44fb-bec2-9430b0c20e29-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.807735 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.807641 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/519fe37e-7387-44fb-bec2-9430b0c20e29-sys\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.807735 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.807669 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/519fe37e-7387-44fb-bec2-9430b0c20e29-node-exporter-wtmp\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.807735 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.807695 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cbnm\" (UniqueName: \"kubernetes.io/projected/519fe37e-7387-44fb-bec2-9430b0c20e29-kube-api-access-7cbnm\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 
19:54:54.807842 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.807759 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/519fe37e-7387-44fb-bec2-9430b0c20e29-root\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.909093 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.909057 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/519fe37e-7387-44fb-bec2-9430b0c20e29-node-exporter-textfile\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.909093 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.909087 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/519fe37e-7387-44fb-bec2-9430b0c20e29-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.909290 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.909115 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/519fe37e-7387-44fb-bec2-9430b0c20e29-sys\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.909290 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.909138 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/519fe37e-7387-44fb-bec2-9430b0c20e29-node-exporter-wtmp\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " 
pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.909290 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.909156 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7cbnm\" (UniqueName: \"kubernetes.io/projected/519fe37e-7387-44fb-bec2-9430b0c20e29-kube-api-access-7cbnm\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.909290 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.909184 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/519fe37e-7387-44fb-bec2-9430b0c20e29-root\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.909290 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.909236 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/519fe37e-7387-44fb-bec2-9430b0c20e29-sys\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.909290 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.909284 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/519fe37e-7387-44fb-bec2-9430b0c20e29-node-exporter-tls\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.909620 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.909332 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/519fe37e-7387-44fb-bec2-9430b0c20e29-node-exporter-wtmp\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " 
pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.909620 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.909335 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/519fe37e-7387-44fb-bec2-9430b0c20e29-metrics-client-ca\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.909620 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.909389 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/519fe37e-7387-44fb-bec2-9430b0c20e29-node-exporter-accelerators-collector-config\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.909620 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.909527 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/519fe37e-7387-44fb-bec2-9430b0c20e29-root\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.910191 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.910167 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/519fe37e-7387-44fb-bec2-9430b0c20e29-node-exporter-textfile\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.910618 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.910582 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/519fe37e-7387-44fb-bec2-9430b0c20e29-metrics-client-ca\") pod \"node-exporter-5wkvv\" 
(UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.910726 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.910652 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/519fe37e-7387-44fb-bec2-9430b0c20e29-node-exporter-accelerators-collector-config\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.912952 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.912917 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/519fe37e-7387-44fb-bec2-9430b0c20e29-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.913029 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.912991 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/519fe37e-7387-44fb-bec2-9430b0c20e29-node-exporter-tls\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:54.922471 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:54.922453 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cbnm\" (UniqueName: \"kubernetes.io/projected/519fe37e-7387-44fb-bec2-9430b0c20e29-kube-api-access-7cbnm\") pod \"node-exporter-5wkvv\" (UID: \"519fe37e-7387-44fb-bec2-9430b0c20e29\") " pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:55.050684 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:55.050593 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-5wkvv" Apr 16 19:54:55.060305 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:54:55.060277 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod519fe37e_7387_44fb_bec2_9430b0c20e29.slice/crio-a6222dc2c8fe385083e1b6a4eea0acdf2a91c88ef0f6e9422a87a6727c235750 WatchSource:0}: Error finding container a6222dc2c8fe385083e1b6a4eea0acdf2a91c88ef0f6e9422a87a6727c235750: Status 404 returned error can't find the container with id a6222dc2c8fe385083e1b6a4eea0acdf2a91c88ef0f6e9422a87a6727c235750 Apr 16 19:54:55.184561 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:55.184530 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5wkvv" event={"ID":"519fe37e-7387-44fb-bec2-9430b0c20e29","Type":"ContainerStarted","Data":"a6222dc2c8fe385083e1b6a4eea0acdf2a91c88ef0f6e9422a87a6727c235750"} Apr 16 19:54:55.317669 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:55.317580 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w" Apr 16 19:54:56.188191 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:56.188160 2563 generic.go:358] "Generic (PLEG): container finished" podID="519fe37e-7387-44fb-bec2-9430b0c20e29" containerID="bb5c3d3b98e2aebfee2dd1457613a46cf1fa645a82af0aea470fea09827d21ad" exitCode=0 Apr 16 19:54:56.188635 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:56.188220 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5wkvv" event={"ID":"519fe37e-7387-44fb-bec2-9430b0c20e29","Type":"ContainerDied","Data":"bb5c3d3b98e2aebfee2dd1457613a46cf1fa645a82af0aea470fea09827d21ad"} Apr 16 19:54:57.193277 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:57.193242 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5wkvv" 
event={"ID":"519fe37e-7387-44fb-bec2-9430b0c20e29","Type":"ContainerStarted","Data":"5dad0ecb1a06d0aaf3e515892c76116a4c5857275b1a23fcfd5c5be3daeb50e8"} Apr 16 19:54:57.193709 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:57.193285 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5wkvv" event={"ID":"519fe37e-7387-44fb-bec2-9430b0c20e29","Type":"ContainerStarted","Data":"33e05aa9d8f8afcdb016a9806e3acb3e485d3dd03d4f9d79096497f24ee164f6"} Apr 16 19:54:57.218954 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:54:57.218897 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-5wkvv" podStartSLOduration=2.321381618 podStartE2EDuration="3.21888373s" podCreationTimestamp="2026-04-16 19:54:54 +0000 UTC" firstStartedPulling="2026-04-16 19:54:55.062374894 +0000 UTC m=+73.838951874" lastFinishedPulling="2026-04-16 19:54:55.959876991 +0000 UTC m=+74.736453986" observedRunningTime="2026-04-16 19:54:57.216899208 +0000 UTC m=+75.993476220" watchObservedRunningTime="2026-04-16 19:54:57.21888373 +0000 UTC m=+75.995460729" Apr 16 19:55:00.896444 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.896308 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:55:00.901825 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.901791 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:00.907952 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.907925 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 19:55:00.909629 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.909590 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 19:55:00.909747 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.909643 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 19:55:00.909747 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.909642 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 19:55:00.909747 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.909692 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 19:55:00.909747 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.909696 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 19:55:00.909941 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.909819 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-ahu2531gpgd6s\"" Apr 16 19:55:00.910059 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.910041 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 19:55:00.910128 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.910101 2563 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-r4mw4\"" Apr 16 19:55:00.910191 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.910156 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 19:55:00.910250 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.910226 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 19:55:00.910391 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.910372 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 19:55:00.910617 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.910587 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 19:55:00.910772 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.910753 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 19:55:00.914443 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.914425 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 19:55:00.917694 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.917674 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:55:00.957573 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.957543 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
19:55:00.957687 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.957640 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:00.957687 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.957668 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:00.957794 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.957699 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-config-out\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:00.957794 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.957728 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:00.957794 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.957756 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:00.957933 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.957816 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:00.957933 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.957845 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:00.957933 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.957902 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sftns\" (UniqueName: \"kubernetes.io/projected/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-kube-api-access-sftns\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:00.958083 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.957939 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-config\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:00.958083 ip-10-0-135-244 kubenswrapper[2563]: I0416 
19:55:00.957981 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:00.958083 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.958043 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:00.958083 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.958070 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-web-config\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:00.958250 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.958132 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:00.958250 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.958166 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-metrics-client-certs\") 
pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:00.958250 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.958197 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:00.958250 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.958222 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:00.958395 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:00.958289 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:01.059245 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.059214 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:01.059423 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.059263 2563 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.059423 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.059282 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.059423 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.059405 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sftns\" (UniqueName: \"kubernetes.io/projected/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-kube-api-access-sftns\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.059582 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.059453 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-config\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.059582 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.059480 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.059582 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.059518 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.059582 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.059542 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-web-config\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.059804 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.059582 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.059804 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.059627 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.059804 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.059657 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.059804 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.059685 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.059804 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.059713 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.060060 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.059880 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.060060 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.059956 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.060060 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.059990 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.060060 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.060028 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-config-out\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.060060 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.060057 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.060296 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.060074 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.061726 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.061072 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.063673 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.063648 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.064427 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.064368 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-config\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.064547 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.064522 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.064644 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.064548 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.064644 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.064558 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.065093 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.065072 2563 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-config-out\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.065238 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.065217 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.065367 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.065339 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.065482 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.065419 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.065832 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.065809 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-web-config\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.066214 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.066007 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.066214 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.066080 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.066390 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.066369 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.066914 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.066893 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.067221 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.067199 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.085266 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.085243 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sftns\" (UniqueName: \"kubernetes.io/projected/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-kube-api-access-sftns\") pod \"prometheus-k8s-0\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:01.213248 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:01.213162 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:04.874047 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:04.874020 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:55:05.218795 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:05.218750 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-vjnr7" event={"ID":"9f502296-7bae-46a2-93aa-fb3effb2035b","Type":"ContainerStarted","Data":"e2d75031c12adf92e24938142ddc1c8397a0513c709728106f85f0c66c136133"}
Apr 16 19:55:05.219103 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:05.219074 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-vjnr7"
Apr 16 19:55:05.220143 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:05.220120 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89da7cf8-3e31-4de4-a28a-a83ce3eb145c","Type":"ContainerStarted","Data":"cfa396a4339c41bebde23b212600ec5fb52bfed2f7eee50bd8cbc39e15d70f54"}
Apr 16 19:55:05.232517 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:05.232493 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-vjnr7"
Apr 16 19:55:05.236387 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:05.236348 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-vjnr7" podStartSLOduration=1.106436197 podStartE2EDuration="20.236335484s" podCreationTimestamp="2026-04-16 19:54:45 +0000 UTC" firstStartedPulling="2026-04-16 19:54:45.706959834 +0000 UTC m=+64.483536811" lastFinishedPulling="2026-04-16 19:55:04.836859105 +0000 UTC m=+83.613436098" observedRunningTime="2026-04-16 19:55:05.235467681 +0000 UTC m=+84.012044681" watchObservedRunningTime="2026-04-16 19:55:05.236335484 +0000 UTC m=+84.012912521"
Apr 16 19:55:06.225663 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:06.225553 2563 generic.go:358] "Generic (PLEG): container finished" podID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerID="83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78" exitCode=0
Apr 16 19:55:06.225663 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:06.225645 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89da7cf8-3e31-4de4-a28a-a83ce3eb145c","Type":"ContainerDied","Data":"83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78"}
Apr 16 19:55:07.153236 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:07.153207 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5b8bff98b5-kb9d6"
Apr 16 19:55:10.241325 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.241282 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89da7cf8-3e31-4de4-a28a-a83ce3eb145c","Type":"ContainerStarted","Data":"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b"}
Apr 16 19:55:10.241325 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.241327 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89da7cf8-3e31-4de4-a28a-a83ce3eb145c","Type":"ContainerStarted","Data":"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48"}
Apr 16 19:55:10.330570 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.330507 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w" podUID="7b325581-2334-4c35-ade9-6b22c9297769" containerName="registry" containerID="cri-o://7e52126742f49e88508b2f16e4a13c936313741c789daf0830001278c12ccfd8" gracePeriod=30
Apr 16 19:55:10.602029 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.602000 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:55:10.745830 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.745747 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl5xl\" (UniqueName: \"kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-kube-api-access-gl5xl\") pod \"7b325581-2334-4c35-ade9-6b22c9297769\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") "
Apr 16 19:55:10.745830 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.745815 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b325581-2334-4c35-ade9-6b22c9297769-registry-certificates\") pod \"7b325581-2334-4c35-ade9-6b22c9297769\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") "
Apr 16 19:55:10.746054 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.745992 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls\") pod \"7b325581-2334-4c35-ade9-6b22c9297769\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") "
Apr 16 19:55:10.746054 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.746042 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName:
\"kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-bound-sa-token\") pod \"7b325581-2334-4c35-ade9-6b22c9297769\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") "
Apr 16 19:55:10.746160 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.746074 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7b325581-2334-4c35-ade9-6b22c9297769-image-registry-private-configuration\") pod \"7b325581-2334-4c35-ade9-6b22c9297769\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") "
Apr 16 19:55:10.746214 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.746179 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b325581-2334-4c35-ade9-6b22c9297769-ca-trust-extracted\") pod \"7b325581-2334-4c35-ade9-6b22c9297769\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") "
Apr 16 19:55:10.746214 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.746191 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b325581-2334-4c35-ade9-6b22c9297769-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7b325581-2334-4c35-ade9-6b22c9297769" (UID: "7b325581-2334-4c35-ade9-6b22c9297769"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:55:10.746312 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.746254 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b325581-2334-4c35-ade9-6b22c9297769-installation-pull-secrets\") pod \"7b325581-2334-4c35-ade9-6b22c9297769\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") "
Apr 16 19:55:10.746312 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.746290 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b325581-2334-4c35-ade9-6b22c9297769-trusted-ca\") pod \"7b325581-2334-4c35-ade9-6b22c9297769\" (UID: \"7b325581-2334-4c35-ade9-6b22c9297769\") "
Apr 16 19:55:10.746544 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.746522 2563 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b325581-2334-4c35-ade9-6b22c9297769-registry-certificates\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\""
Apr 16 19:55:10.747318 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.747285 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b325581-2334-4c35-ade9-6b22c9297769-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7b325581-2334-4c35-ade9-6b22c9297769" (UID: "7b325581-2334-4c35-ade9-6b22c9297769"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:55:10.748893 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.748826 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-kube-api-access-gl5xl" (OuterVolumeSpecName: "kube-api-access-gl5xl") pod "7b325581-2334-4c35-ade9-6b22c9297769" (UID: "7b325581-2334-4c35-ade9-6b22c9297769"). InnerVolumeSpecName "kube-api-access-gl5xl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:55:10.748893 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.748848 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b325581-2334-4c35-ade9-6b22c9297769-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "7b325581-2334-4c35-ade9-6b22c9297769" (UID: "7b325581-2334-4c35-ade9-6b22c9297769"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:55:10.749016 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.748984 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7b325581-2334-4c35-ade9-6b22c9297769" (UID: "7b325581-2334-4c35-ade9-6b22c9297769"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:55:10.749490 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.749454 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b325581-2334-4c35-ade9-6b22c9297769-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7b325581-2334-4c35-ade9-6b22c9297769" (UID: "7b325581-2334-4c35-ade9-6b22c9297769"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:55:10.751116 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.751091 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7b325581-2334-4c35-ade9-6b22c9297769" (UID: "7b325581-2334-4c35-ade9-6b22c9297769"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:55:10.757936 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.757909 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b325581-2334-4c35-ade9-6b22c9297769-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7b325581-2334-4c35-ade9-6b22c9297769" (UID: "7b325581-2334-4c35-ade9-6b22c9297769"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:55:10.848099 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.848067 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gl5xl\" (UniqueName: \"kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-kube-api-access-gl5xl\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\""
Apr 16 19:55:10.848099 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.848104 2563 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-registry-tls\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\""
Apr 16 19:55:10.848448 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.848116 2563 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b325581-2334-4c35-ade9-6b22c9297769-bound-sa-token\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\""
Apr 16 19:55:10.848448 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.848133 2563 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7b325581-2334-4c35-ade9-6b22c9297769-image-registry-private-configuration\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\""
Apr 16 19:55:10.848448 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.848148 2563 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b325581-2334-4c35-ade9-6b22c9297769-ca-trust-extracted\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\""
Apr 16 19:55:10.848448 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.848160 2563 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b325581-2334-4c35-ade9-6b22c9297769-installation-pull-secrets\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\""
Apr 16 19:55:10.848448 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:10.848172 2563 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b325581-2334-4c35-ade9-6b22c9297769-trusted-ca\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\""
Apr 16 19:55:11.246122 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:11.245992 2563 generic.go:358] "Generic (PLEG): container finished" podID="7b325581-2334-4c35-ade9-6b22c9297769" containerID="7e52126742f49e88508b2f16e4a13c936313741c789daf0830001278c12ccfd8" exitCode=0
Apr 16 19:55:11.246122 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:11.246079 2563 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w"
Apr 16 19:55:11.246122 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:11.246083 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w" event={"ID":"7b325581-2334-4c35-ade9-6b22c9297769","Type":"ContainerDied","Data":"7e52126742f49e88508b2f16e4a13c936313741c789daf0830001278c12ccfd8"}
Apr 16 19:55:11.246733 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:11.246131 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7cc4c694b8-2m66w" event={"ID":"7b325581-2334-4c35-ade9-6b22c9297769","Type":"ContainerDied","Data":"8c85b8e10db10ede70e17f7002b6c8980ac18c9cf03bd0d6fee53213ae353a26"}
Apr 16 19:55:11.246733 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:11.246154 2563 scope.go:117] "RemoveContainer" containerID="7e52126742f49e88508b2f16e4a13c936313741c789daf0830001278c12ccfd8"
Apr 16 19:55:11.265157 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:11.265142 2563 scope.go:117] "RemoveContainer" containerID="7e52126742f49e88508b2f16e4a13c936313741c789daf0830001278c12ccfd8"
Apr 16 19:55:11.265626 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:55:11.265546 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e52126742f49e88508b2f16e4a13c936313741c789daf0830001278c12ccfd8\": container with ID starting with 7e52126742f49e88508b2f16e4a13c936313741c789daf0830001278c12ccfd8 not found: ID does not exist" containerID="7e52126742f49e88508b2f16e4a13c936313741c789daf0830001278c12ccfd8"
Apr 16 19:55:11.265711 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:11.265585 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e52126742f49e88508b2f16e4a13c936313741c789daf0830001278c12ccfd8"} err="failed to get container status \"7e52126742f49e88508b2f16e4a13c936313741c789daf0830001278c12ccfd8\": rpc error: code = NotFound desc = could not find container \"7e52126742f49e88508b2f16e4a13c936313741c789daf0830001278c12ccfd8\": container with ID starting with 7e52126742f49e88508b2f16e4a13c936313741c789daf0830001278c12ccfd8 not found: ID does not exist"
Apr 16 19:55:11.283038 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:11.283017 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7cc4c694b8-2m66w"]
Apr 16 19:55:11.287767 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:11.287741 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7cc4c694b8-2m66w"]
Apr 16 19:55:11.799675 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:11.799637 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b325581-2334-4c35-ade9-6b22c9297769" path="/var/lib/kubelet/pods/7b325581-2334-4c35-ade9-6b22c9297769/volumes"
Apr 16 19:55:13.257227 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:13.257193 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89da7cf8-3e31-4de4-a28a-a83ce3eb145c","Type":"ContainerStarted","Data":"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a"}
Apr 16 19:55:13.257780 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:13.257237 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89da7cf8-3e31-4de4-a28a-a83ce3eb145c","Type":"ContainerStarted","Data":"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e"}
Apr 16 19:55:13.257780 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:13.257253 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89da7cf8-3e31-4de4-a28a-a83ce3eb145c","Type":"ContainerStarted","Data":"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55"}
Apr 16 19:55:13.257780 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:13.257266 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89da7cf8-3e31-4de4-a28a-a83ce3eb145c","Type":"ContainerStarted","Data":"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569"}
Apr 16 19:55:13.290164 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:13.290107 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=5.805162878 podStartE2EDuration="13.290090352s" podCreationTimestamp="2026-04-16 19:55:00 +0000 UTC" firstStartedPulling="2026-04-16 19:55:04.876948999 +0000 UTC m=+83.653525975" lastFinishedPulling="2026-04-16 19:55:12.361876458 +0000 UTC m=+91.138453449" observedRunningTime="2026-04-16 19:55:13.287499736 +0000 UTC m=+92.064076760" watchObservedRunningTime="2026-04-16 19:55:13.290090352 +0000 UTC m=+92.066667352"
Apr 16 19:55:16.213957 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:55:16.213907 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:01.213824 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:01.213787 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:01.232026 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:01.231995 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:01.402232 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:01.402207 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:19.325784 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.325742 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:56:19.326511 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.326304 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="prometheus" containerID="cri-o://a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48" gracePeriod=600
Apr 16 19:56:19.326511 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.326341 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="kube-rbac-proxy" containerID="cri-o://ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e" gracePeriod=600
Apr 16 19:56:19.326511 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.326375 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="kube-rbac-proxy-web" containerID="cri-o://4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55" gracePeriod=600
Apr 16 19:56:19.326511 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.326421 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="config-reloader" containerID="cri-o://15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b" gracePeriod=600
Apr 16 19:56:19.326511 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.326371 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="kube-rbac-proxy-thanos" containerID="cri-o://3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a" gracePeriod=600
Apr 16 19:56:19.326821 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.326552 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="thanos-sidecar" containerID="cri-o://39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569" gracePeriod=600
Apr 16 19:56:19.565184 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.565161 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:19.648755 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.648727 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-tls-assets\") pod \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") "
Apr 16 19:56:19.648926 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.648758 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-prometheus-k8s-db\") pod \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") "
Apr 16 19:56:19.648926 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.648782 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-kube-rbac-proxy\") pod \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") "
Apr 16 19:56:19.648926 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.648800 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-config-out\") pod \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") "
Apr 16 19:56:19.648926 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.648816 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sftns\" (UniqueName: \"kubernetes.io/projected/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-kube-api-access-sftns\") pod \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") "
Apr 16 19:56:19.648926 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.648849 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-metrics-client-certs\") pod \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") "
Apr 16 19:56:19.648926 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.648907 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") "
Apr 16 19:56:19.649242 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.648933 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-configmap-metrics-client-ca\") pod \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") "
Apr 16 19:56:19.649242 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.648958 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\" (UID:
\"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " Apr 16 19:56:19.649242 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.648988 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-configmap-serving-certs-ca-bundle\") pod \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " Apr 16 19:56:19.649242 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.649040 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-grpc-tls\") pod \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " Apr 16 19:56:19.649242 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.649069 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-config\") pod \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " Apr 16 19:56:19.649242 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.649100 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-configmap-kubelet-serving-ca-bundle\") pod \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " Apr 16 19:56:19.649242 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.649126 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-prometheus-trusted-ca-bundle\") pod \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " 
Apr 16 19:56:19.649242 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.649153 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-web-config\") pod \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " Apr 16 19:56:19.649242 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.649181 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-prometheus-k8s-rulefiles-0\") pod \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " Apr 16 19:56:19.649242 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.649225 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-prometheus-k8s-tls\") pod \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " Apr 16 19:56:19.649747 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.649278 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-thanos-prometheus-http-client-file\") pod \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\" (UID: \"89da7cf8-3e31-4de4-a28a-a83ce3eb145c\") " Apr 16 19:56:19.650041 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.650012 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "89da7cf8-3e31-4de4-a28a-a83ce3eb145c" (UID: "89da7cf8-3e31-4de4-a28a-a83ce3eb145c"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:56:19.650545 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.650514 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "89da7cf8-3e31-4de4-a28a-a83ce3eb145c" (UID: "89da7cf8-3e31-4de4-a28a-a83ce3eb145c"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:19.651404 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.651368 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "89da7cf8-3e31-4de4-a28a-a83ce3eb145c" (UID: "89da7cf8-3e31-4de4-a28a-a83ce3eb145c"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:19.651510 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.651417 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-config-out" (OuterVolumeSpecName: "config-out") pod "89da7cf8-3e31-4de4-a28a-a83ce3eb145c" (UID: "89da7cf8-3e31-4de4-a28a-a83ce3eb145c"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:56:19.651510 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.651436 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "89da7cf8-3e31-4de4-a28a-a83ce3eb145c" (UID: "89da7cf8-3e31-4de4-a28a-a83ce3eb145c"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:56:19.651663 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.651512 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-kube-api-access-sftns" (OuterVolumeSpecName: "kube-api-access-sftns") pod "89da7cf8-3e31-4de4-a28a-a83ce3eb145c" (UID: "89da7cf8-3e31-4de4-a28a-a83ce3eb145c"). InnerVolumeSpecName "kube-api-access-sftns". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:56:19.651835 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.651802 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "89da7cf8-3e31-4de4-a28a-a83ce3eb145c" (UID: "89da7cf8-3e31-4de4-a28a-a83ce3eb145c"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:19.651972 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.651870 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "89da7cf8-3e31-4de4-a28a-a83ce3eb145c" (UID: "89da7cf8-3e31-4de4-a28a-a83ce3eb145c"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:19.652745 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.652145 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "89da7cf8-3e31-4de4-a28a-a83ce3eb145c" (UID: "89da7cf8-3e31-4de4-a28a-a83ce3eb145c"). 
InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:19.652745 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.652298 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "89da7cf8-3e31-4de4-a28a-a83ce3eb145c" (UID: "89da7cf8-3e31-4de4-a28a-a83ce3eb145c"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:19.652745 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.652643 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "89da7cf8-3e31-4de4-a28a-a83ce3eb145c" (UID: "89da7cf8-3e31-4de4-a28a-a83ce3eb145c"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:19.652745 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.652678 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "89da7cf8-3e31-4de4-a28a-a83ce3eb145c" (UID: "89da7cf8-3e31-4de4-a28a-a83ce3eb145c"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:19.653001 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.652903 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "89da7cf8-3e31-4de4-a28a-a83ce3eb145c" (UID: "89da7cf8-3e31-4de4-a28a-a83ce3eb145c"). 
InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:19.653001 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.652991 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-config" (OuterVolumeSpecName: "config") pod "89da7cf8-3e31-4de4-a28a-a83ce3eb145c" (UID: "89da7cf8-3e31-4de4-a28a-a83ce3eb145c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:19.653978 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.653957 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "89da7cf8-3e31-4de4-a28a-a83ce3eb145c" (UID: "89da7cf8-3e31-4de4-a28a-a83ce3eb145c"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:19.654093 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.654077 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "89da7cf8-3e31-4de4-a28a-a83ce3eb145c" (UID: "89da7cf8-3e31-4de4-a28a-a83ce3eb145c"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:19.654223 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.654204 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "89da7cf8-3e31-4de4-a28a-a83ce3eb145c" (UID: "89da7cf8-3e31-4de4-a28a-a83ce3eb145c"). 
InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:19.662577 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.662558 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-web-config" (OuterVolumeSpecName: "web-config") pod "89da7cf8-3e31-4de4-a28a-a83ce3eb145c" (UID: "89da7cf8-3e31-4de4-a28a-a83ce3eb145c"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:19.749982 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.749957 2563 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-prometheus-k8s-db\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.749982 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.749980 2563 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-kube-rbac-proxy\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.750098 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.749990 2563 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-config-out\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.750098 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.750000 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sftns\" (UniqueName: \"kubernetes.io/projected/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-kube-api-access-sftns\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.750098 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.750009 2563 reconciler_common.go:299] "Volume detached for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-metrics-client-certs\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.750098 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.750019 2563 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.750098 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.750028 2563 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-configmap-metrics-client-ca\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.750098 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.750038 2563 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.750098 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.750046 2563 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.750098 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.750056 2563 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-grpc-tls\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.750098 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.750065 
2563 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-config\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.750098 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.750073 2563 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.750098 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.750081 2563 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-prometheus-trusted-ca-bundle\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.750098 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.750090 2563 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-web-config\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.750098 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.750098 2563 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.750463 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.750106 2563 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-secret-prometheus-k8s-tls\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.750463 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.750115 2563 reconciler_common.go:299] "Volume 
detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-thanos-prometheus-http-client-file\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.750463 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:19.750124 2563 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89da7cf8-3e31-4de4-a28a-a83ce3eb145c-tls-assets\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 19:56:20.439984 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.439941 2563 generic.go:358] "Generic (PLEG): container finished" podID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerID="3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a" exitCode=0 Apr 16 19:56:20.439984 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.439971 2563 generic.go:358] "Generic (PLEG): container finished" podID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerID="ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e" exitCode=0 Apr 16 19:56:20.439984 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.439977 2563 generic.go:358] "Generic (PLEG): container finished" podID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerID="4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55" exitCode=0 Apr 16 19:56:20.439984 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.439986 2563 generic.go:358] "Generic (PLEG): container finished" podID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerID="39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569" exitCode=0 Apr 16 19:56:20.439984 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.439993 2563 generic.go:358] "Generic (PLEG): container finished" podID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerID="15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b" exitCode=0 Apr 16 19:56:20.440523 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.440001 2563 
generic.go:358] "Generic (PLEG): container finished" podID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerID="a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48" exitCode=0 Apr 16 19:56:20.440523 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.440029 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89da7cf8-3e31-4de4-a28a-a83ce3eb145c","Type":"ContainerDied","Data":"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a"} Apr 16 19:56:20.440523 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.440070 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89da7cf8-3e31-4de4-a28a-a83ce3eb145c","Type":"ContainerDied","Data":"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e"} Apr 16 19:56:20.440523 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.440081 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89da7cf8-3e31-4de4-a28a-a83ce3eb145c","Type":"ContainerDied","Data":"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55"} Apr 16 19:56:20.440523 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.440090 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89da7cf8-3e31-4de4-a28a-a83ce3eb145c","Type":"ContainerDied","Data":"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569"} Apr 16 19:56:20.440523 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.440099 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89da7cf8-3e31-4de4-a28a-a83ce3eb145c","Type":"ContainerDied","Data":"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b"} Apr 16 19:56:20.440523 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.440107 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89da7cf8-3e31-4de4-a28a-a83ce3eb145c","Type":"ContainerDied","Data":"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48"} Apr 16 19:56:20.440523 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.440041 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:20.440523 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.440128 2563 scope.go:117] "RemoveContainer" containerID="3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a" Apr 16 19:56:20.440523 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.440116 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89da7cf8-3e31-4de4-a28a-a83ce3eb145c","Type":"ContainerDied","Data":"cfa396a4339c41bebde23b212600ec5fb52bfed2f7eee50bd8cbc39e15d70f54"} Apr 16 19:56:20.447068 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.447055 2563 scope.go:117] "RemoveContainer" containerID="ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e" Apr 16 19:56:20.452960 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.452939 2563 scope.go:117] "RemoveContainer" containerID="4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55" Apr 16 19:56:20.458757 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.458742 2563 scope.go:117] "RemoveContainer" containerID="39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569" Apr 16 19:56:20.461312 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.461295 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:56:20.465683 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.465667 2563 scope.go:117] "RemoveContainer" containerID="15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b" Apr 16 19:56:20.465918 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.465897 2563 
kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:56:20.471449 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.471430 2563 scope.go:117] "RemoveContainer" containerID="a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48" Apr 16 19:56:20.477678 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.477663 2563 scope.go:117] "RemoveContainer" containerID="83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78" Apr 16 19:56:20.483131 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.483119 2563 scope.go:117] "RemoveContainer" containerID="3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a" Apr 16 19:56:20.483330 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:56:20.483314 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a\": container with ID starting with 3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a not found: ID does not exist" containerID="3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a" Apr 16 19:56:20.483378 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.483336 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a"} err="failed to get container status \"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a\": rpc error: code = NotFound desc = could not find container \"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a\": container with ID starting with 3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a not found: ID does not exist" Apr 16 19:56:20.483378 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.483353 2563 scope.go:117] "RemoveContainer" containerID="ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e" Apr 16 
19:56:20.483568 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:56:20.483552 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e\": container with ID starting with ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e not found: ID does not exist" containerID="ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e" Apr 16 19:56:20.483620 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.483572 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e"} err="failed to get container status \"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e\": rpc error: code = NotFound desc = could not find container \"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e\": container with ID starting with ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e not found: ID does not exist" Apr 16 19:56:20.483620 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.483586 2563 scope.go:117] "RemoveContainer" containerID="4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55" Apr 16 19:56:20.483806 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:56:20.483789 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55\": container with ID starting with 4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55 not found: ID does not exist" containerID="4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55" Apr 16 19:56:20.483844 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.483811 2563 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55"} err="failed to get container status \"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55\": rpc error: code = NotFound desc = could not find container \"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55\": container with ID starting with 4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55 not found: ID does not exist" Apr 16 19:56:20.483844 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.483834 2563 scope.go:117] "RemoveContainer" containerID="39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569" Apr 16 19:56:20.484034 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:56:20.484020 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569\": container with ID starting with 39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569 not found: ID does not exist" containerID="39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569" Apr 16 19:56:20.484073 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.484039 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569"} err="failed to get container status \"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569\": rpc error: code = NotFound desc = could not find container \"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569\": container with ID starting with 39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569 not found: ID does not exist" Apr 16 19:56:20.484073 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.484051 2563 scope.go:117] "RemoveContainer" containerID="15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b" Apr 16 19:56:20.484269 ip-10-0-135-244 
kubenswrapper[2563]: E0416 19:56:20.484252 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b\": container with ID starting with 15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b not found: ID does not exist" containerID="15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b" Apr 16 19:56:20.484303 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.484274 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b"} err="failed to get container status \"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b\": rpc error: code = NotFound desc = could not find container \"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b\": container with ID starting with 15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b not found: ID does not exist" Apr 16 19:56:20.484303 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.484291 2563 scope.go:117] "RemoveContainer" containerID="a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48" Apr 16 19:56:20.484503 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:56:20.484488 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48\": container with ID starting with a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48 not found: ID does not exist" containerID="a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48" Apr 16 19:56:20.484560 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.484505 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48"} 
err="failed to get container status \"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48\": rpc error: code = NotFound desc = could not find container \"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48\": container with ID starting with a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48 not found: ID does not exist" Apr 16 19:56:20.484560 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.484516 2563 scope.go:117] "RemoveContainer" containerID="83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78" Apr 16 19:56:20.484749 ip-10-0-135-244 kubenswrapper[2563]: E0416 19:56:20.484734 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78\": container with ID starting with 83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78 not found: ID does not exist" containerID="83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78" Apr 16 19:56:20.484790 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.484753 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78"} err="failed to get container status \"83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78\": rpc error: code = NotFound desc = could not find container \"83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78\": container with ID starting with 83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78 not found: ID does not exist" Apr 16 19:56:20.484790 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.484766 2563 scope.go:117] "RemoveContainer" containerID="3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a" Apr 16 19:56:20.484960 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.484945 2563 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a"} err="failed to get container status \"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a\": rpc error: code = NotFound desc = could not find container \"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a\": container with ID starting with 3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a not found: ID does not exist" Apr 16 19:56:20.485003 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.484960 2563 scope.go:117] "RemoveContainer" containerID="ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e" Apr 16 19:56:20.485181 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.485150 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e"} err="failed to get container status \"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e\": rpc error: code = NotFound desc = could not find container \"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e\": container with ID starting with ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e not found: ID does not exist" Apr 16 19:56:20.485218 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.485182 2563 scope.go:117] "RemoveContainer" containerID="4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55" Apr 16 19:56:20.485380 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.485363 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55"} err="failed to get container status \"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55\": rpc error: code = NotFound desc = could not find container \"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55\": container with ID starting with 
4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55 not found: ID does not exist" Apr 16 19:56:20.485380 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.485379 2563 scope.go:117] "RemoveContainer" containerID="39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569" Apr 16 19:56:20.485622 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.485588 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569"} err="failed to get container status \"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569\": rpc error: code = NotFound desc = could not find container \"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569\": container with ID starting with 39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569 not found: ID does not exist" Apr 16 19:56:20.485622 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.485619 2563 scope.go:117] "RemoveContainer" containerID="15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b" Apr 16 19:56:20.485859 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.485842 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b"} err="failed to get container status \"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b\": rpc error: code = NotFound desc = could not find container \"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b\": container with ID starting with 15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b not found: ID does not exist" Apr 16 19:56:20.485859 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.485859 2563 scope.go:117] "RemoveContainer" containerID="a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48" Apr 16 19:56:20.486133 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.486095 2563 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48"} err="failed to get container status \"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48\": rpc error: code = NotFound desc = could not find container \"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48\": container with ID starting with a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48 not found: ID does not exist" Apr 16 19:56:20.486201 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.486135 2563 scope.go:117] "RemoveContainer" containerID="83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78" Apr 16 19:56:20.486356 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.486339 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78"} err="failed to get container status \"83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78\": rpc error: code = NotFound desc = could not find container \"83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78\": container with ID starting with 83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78 not found: ID does not exist" Apr 16 19:56:20.486400 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.486358 2563 scope.go:117] "RemoveContainer" containerID="3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a" Apr 16 19:56:20.486558 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.486537 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a"} err="failed to get container status \"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a\": rpc error: code = NotFound desc = could not find container 
\"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a\": container with ID starting with 3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a not found: ID does not exist" Apr 16 19:56:20.486642 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.486560 2563 scope.go:117] "RemoveContainer" containerID="ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e" Apr 16 19:56:20.486827 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.486807 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e"} err="failed to get container status \"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e\": rpc error: code = NotFound desc = could not find container \"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e\": container with ID starting with ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e not found: ID does not exist" Apr 16 19:56:20.486878 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.486828 2563 scope.go:117] "RemoveContainer" containerID="4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55" Apr 16 19:56:20.487029 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.487011 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55"} err="failed to get container status \"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55\": rpc error: code = NotFound desc = could not find container \"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55\": container with ID starting with 4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55 not found: ID does not exist" Apr 16 19:56:20.487068 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.487030 2563 scope.go:117] "RemoveContainer" 
containerID="39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569" Apr 16 19:56:20.487234 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.487214 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569"} err="failed to get container status \"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569\": rpc error: code = NotFound desc = could not find container \"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569\": container with ID starting with 39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569 not found: ID does not exist" Apr 16 19:56:20.487303 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.487237 2563 scope.go:117] "RemoveContainer" containerID="15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b" Apr 16 19:56:20.487451 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.487435 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b"} err="failed to get container status \"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b\": rpc error: code = NotFound desc = could not find container \"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b\": container with ID starting with 15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b not found: ID does not exist" Apr 16 19:56:20.487498 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.487452 2563 scope.go:117] "RemoveContainer" containerID="a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48" Apr 16 19:56:20.487635 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.487617 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48"} err="failed to get container status 
\"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48\": rpc error: code = NotFound desc = could not find container \"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48\": container with ID starting with a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48 not found: ID does not exist" Apr 16 19:56:20.487675 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.487635 2563 scope.go:117] "RemoveContainer" containerID="83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78" Apr 16 19:56:20.487833 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.487816 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78"} err="failed to get container status \"83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78\": rpc error: code = NotFound desc = could not find container \"83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78\": container with ID starting with 83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78 not found: ID does not exist" Apr 16 19:56:20.487893 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.487835 2563 scope.go:117] "RemoveContainer" containerID="3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a" Apr 16 19:56:20.488031 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.488016 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a"} err="failed to get container status \"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a\": rpc error: code = NotFound desc = could not find container \"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a\": container with ID starting with 3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a not found: ID does not exist" Apr 16 19:56:20.488085 ip-10-0-135-244 
kubenswrapper[2563]: I0416 19:56:20.488030 2563 scope.go:117] "RemoveContainer" containerID="ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e" Apr 16 19:56:20.488252 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.488234 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e"} err="failed to get container status \"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e\": rpc error: code = NotFound desc = could not find container \"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e\": container with ID starting with ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e not found: ID does not exist" Apr 16 19:56:20.488303 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.488252 2563 scope.go:117] "RemoveContainer" containerID="4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55" Apr 16 19:56:20.488479 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.488463 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55"} err="failed to get container status \"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55\": rpc error: code = NotFound desc = could not find container \"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55\": container with ID starting with 4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55 not found: ID does not exist" Apr 16 19:56:20.488521 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.488480 2563 scope.go:117] "RemoveContainer" containerID="39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569" Apr 16 19:56:20.488714 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.488693 2563 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569"} err="failed to get container status \"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569\": rpc error: code = NotFound desc = could not find container \"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569\": container with ID starting with 39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569 not found: ID does not exist" Apr 16 19:56:20.488714 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.488714 2563 scope.go:117] "RemoveContainer" containerID="15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b" Apr 16 19:56:20.488917 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.488901 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b"} err="failed to get container status \"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b\": rpc error: code = NotFound desc = could not find container \"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b\": container with ID starting with 15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b not found: ID does not exist" Apr 16 19:56:20.488960 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.488917 2563 scope.go:117] "RemoveContainer" containerID="a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48" Apr 16 19:56:20.489138 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.489117 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48"} err="failed to get container status \"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48\": rpc error: code = NotFound desc = could not find container \"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48\": container with ID starting with 
a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48 not found: ID does not exist" Apr 16 19:56:20.489138 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.489137 2563 scope.go:117] "RemoveContainer" containerID="83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78" Apr 16 19:56:20.489337 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.489322 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78"} err="failed to get container status \"83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78\": rpc error: code = NotFound desc = could not find container \"83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78\": container with ID starting with 83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78 not found: ID does not exist" Apr 16 19:56:20.489337 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.489337 2563 scope.go:117] "RemoveContainer" containerID="3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a" Apr 16 19:56:20.489527 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.489505 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a"} err="failed to get container status \"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a\": rpc error: code = NotFound desc = could not find container \"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a\": container with ID starting with 3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a not found: ID does not exist" Apr 16 19:56:20.489580 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.489529 2563 scope.go:117] "RemoveContainer" containerID="ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e" Apr 16 19:56:20.489757 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.489738 2563 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e"} err="failed to get container status \"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e\": rpc error: code = NotFound desc = could not find container \"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e\": container with ID starting with ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e not found: ID does not exist" Apr 16 19:56:20.489830 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.489770 2563 scope.go:117] "RemoveContainer" containerID="4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55" Apr 16 19:56:20.489992 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.489975 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55"} err="failed to get container status \"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55\": rpc error: code = NotFound desc = could not find container \"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55\": container with ID starting with 4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55 not found: ID does not exist" Apr 16 19:56:20.490044 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.489992 2563 scope.go:117] "RemoveContainer" containerID="39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569" Apr 16 19:56:20.490197 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.490180 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569"} err="failed to get container status \"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569\": rpc error: code = NotFound desc = could not find container 
\"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569\": container with ID starting with 39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569 not found: ID does not exist" Apr 16 19:56:20.490235 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.490197 2563 scope.go:117] "RemoveContainer" containerID="15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b" Apr 16 19:56:20.490365 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.490347 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b"} err="failed to get container status \"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b\": rpc error: code = NotFound desc = could not find container \"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b\": container with ID starting with 15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b not found: ID does not exist" Apr 16 19:56:20.490427 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.490365 2563 scope.go:117] "RemoveContainer" containerID="a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48" Apr 16 19:56:20.490572 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.490556 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48"} err="failed to get container status \"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48\": rpc error: code = NotFound desc = could not find container \"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48\": container with ID starting with a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48 not found: ID does not exist" Apr 16 19:56:20.490572 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.490570 2563 scope.go:117] "RemoveContainer" 
containerID="83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78" Apr 16 19:56:20.490778 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.490763 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78"} err="failed to get container status \"83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78\": rpc error: code = NotFound desc = could not find container \"83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78\": container with ID starting with 83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78 not found: ID does not exist" Apr 16 19:56:20.490778 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.490776 2563 scope.go:117] "RemoveContainer" containerID="3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a" Apr 16 19:56:20.490985 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.490965 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a"} err="failed to get container status \"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a\": rpc error: code = NotFound desc = could not find container \"3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a\": container with ID starting with 3ffa0aa78a587ac21091293d1bbf42c0f85b8e7c48ad5c42c0d878034f61784a not found: ID does not exist" Apr 16 19:56:20.491032 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.490986 2563 scope.go:117] "RemoveContainer" containerID="ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e" Apr 16 19:56:20.491153 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.491139 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e"} err="failed to get container status 
\"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e\": rpc error: code = NotFound desc = could not find container \"ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e\": container with ID starting with ea2d720150f8fb02a49f459ebb95f41627cd2b004f67e4f1631a63746fc3669e not found: ID does not exist" Apr 16 19:56:20.491197 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.491153 2563 scope.go:117] "RemoveContainer" containerID="4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55" Apr 16 19:56:20.491337 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.491317 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55"} err="failed to get container status \"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55\": rpc error: code = NotFound desc = could not find container \"4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55\": container with ID starting with 4b5e208289726e218cd3f90312ddacdc422035eb679a6b46ba3c25682bbb9e55 not found: ID does not exist" Apr 16 19:56:20.491403 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.491338 2563 scope.go:117] "RemoveContainer" containerID="39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569" Apr 16 19:56:20.491547 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.491533 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569"} err="failed to get container status \"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569\": rpc error: code = NotFound desc = could not find container \"39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569\": container with ID starting with 39edaa60a72dab8ae0bd08e56387871680ec7fd777e42ff72ec1c908e08fc569 not found: ID does not exist" Apr 16 19:56:20.491592 ip-10-0-135-244 
kubenswrapper[2563]: I0416 19:56:20.491547 2563 scope.go:117] "RemoveContainer" containerID="15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b" Apr 16 19:56:20.491765 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.491748 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b"} err="failed to get container status \"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b\": rpc error: code = NotFound desc = could not find container \"15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b\": container with ID starting with 15aea4e03a522fcf2665638c2a2529727e202787b7322d508aa6cb834353fd0b not found: ID does not exist" Apr 16 19:56:20.491808 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.491766 2563 scope.go:117] "RemoveContainer" containerID="a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48" Apr 16 19:56:20.491972 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.491953 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48"} err="failed to get container status \"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48\": rpc error: code = NotFound desc = could not find container \"a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48\": container with ID starting with a583031fcbeed90388968072d494588a61d62d037676f94ad8c3dc2cba47cf48 not found: ID does not exist" Apr 16 19:56:20.492015 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.491973 2563 scope.go:117] "RemoveContainer" containerID="83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78" Apr 16 19:56:20.492158 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.492143 2563 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78"} err="failed to get container status \"83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78\": rpc error: code = NotFound desc = could not find container \"83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78\": container with ID starting with 83abd029a2247360aab3e753e76532cdd0121e152e9c6c6ef5b3b96b9fcc8d78 not found: ID does not exist"
Apr 16 19:56:20.494809 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.494791 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:56:20.495070 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495056 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="thanos-sidecar"
Apr 16 19:56:20.495149 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495073 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="thanos-sidecar"
Apr 16 19:56:20.495149 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495089 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="config-reloader"
Apr 16 19:56:20.495149 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495097 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="config-reloader"
Apr 16 19:56:20.495149 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495106 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="kube-rbac-proxy-web"
Apr 16 19:56:20.495149 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495114 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="kube-rbac-proxy-web"
Apr 16 19:56:20.495149 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495123 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="kube-rbac-proxy-thanos"
Apr 16 19:56:20.495149 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495132 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="kube-rbac-proxy-thanos"
Apr 16 19:56:20.495149 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495144 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b325581-2334-4c35-ade9-6b22c9297769" containerName="registry"
Apr 16 19:56:20.495149 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495151 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b325581-2334-4c35-ade9-6b22c9297769" containerName="registry"
Apr 16 19:56:20.495536 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495164 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="init-config-reloader"
Apr 16 19:56:20.495536 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495173 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="init-config-reloader"
Apr 16 19:56:20.495536 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495182 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="prometheus"
Apr 16 19:56:20.495536 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495190 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="prometheus"
Apr 16 19:56:20.495536 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495202 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="kube-rbac-proxy"
Apr 16 19:56:20.495536 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495210 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="kube-rbac-proxy"
Apr 16 19:56:20.495536 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495274 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="prometheus"
Apr 16 19:56:20.495536 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495284 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="config-reloader"
Apr 16 19:56:20.495536 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495295 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="kube-rbac-proxy-web"
Apr 16 19:56:20.495536 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495303 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="kube-rbac-proxy"
Apr 16 19:56:20.495536 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495313 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="thanos-sidecar"
Apr 16 19:56:20.495536 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495323 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b325581-2334-4c35-ade9-6b22c9297769" containerName="registry"
Apr 16 19:56:20.495536 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.495334 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" containerName="kube-rbac-proxy-thanos"
Apr 16 19:56:20.500095 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.500080 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.503048 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.503017 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 19:56:20.503144 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.503110 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-ahu2531gpgd6s\""
Apr 16 19:56:20.503208 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.503195 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 19:56:20.503358 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.503345 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 19:56:20.503418 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.503358 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 19:56:20.503477 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.503345 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 19:56:20.503552 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.503537 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 19:56:20.503660 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.503646 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 19:56:20.504165 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.504125 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 19:56:20.504165 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.504162 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 19:56:20.504326 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.504202 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 19:56:20.504429 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.504331 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-r4mw4\""
Apr 16 19:56:20.504429 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.504372 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 19:56:20.506792 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.506775 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 19:56:20.508188 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.508171 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 19:56:20.511269 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.511251 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:56:20.556776 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.556750 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2226d070-e026-4dbf-8dad-527a1aa3eb7d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.556865 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.556779 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2226d070-e026-4dbf-8dad-527a1aa3eb7d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.556865 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.556798 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2226d070-e026-4dbf-8dad-527a1aa3eb7d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.556865 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.556816 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2226d070-e026-4dbf-8dad-527a1aa3eb7d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.556865 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.556832 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.556865 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.556862 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.557049 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.556881 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j495g\" (UniqueName: \"kubernetes.io/projected/2226d070-e026-4dbf-8dad-527a1aa3eb7d-kube-api-access-j495g\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.557049 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.556901 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.557049 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.556939 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.557049 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.556987 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2226d070-e026-4dbf-8dad-527a1aa3eb7d-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.557049 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.557009 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2226d070-e026-4dbf-8dad-527a1aa3eb7d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.557049 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.557025 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-web-config\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.557049 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.557044 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.557269 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.557091 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2226d070-e026-4dbf-8dad-527a1aa3eb7d-config-out\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.557269 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.557107 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-config\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.557269 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.557123 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2226d070-e026-4dbf-8dad-527a1aa3eb7d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.557269 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.557141 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.557269 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.557155 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.658334 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.658303 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2226d070-e026-4dbf-8dad-527a1aa3eb7d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.658334 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.658338 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2226d070-e026-4dbf-8dad-527a1aa3eb7d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.658525 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.658356 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2226d070-e026-4dbf-8dad-527a1aa3eb7d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.658525 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.658377 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2226d070-e026-4dbf-8dad-527a1aa3eb7d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.658631 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.658588 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.658679 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.658662 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.658723 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.658703 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j495g\" (UniqueName: \"kubernetes.io/projected/2226d070-e026-4dbf-8dad-527a1aa3eb7d-kube-api-access-j495g\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.659441 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.658901 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.659441 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.658940 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.659441 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.658996 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2226d070-e026-4dbf-8dad-527a1aa3eb7d-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.659441 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.659026 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2226d070-e026-4dbf-8dad-527a1aa3eb7d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.659441 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.659051 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-web-config\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.659441 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.659080 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.659441 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.659143 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2226d070-e026-4dbf-8dad-527a1aa3eb7d-config-out\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.659441 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.659171 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-config\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.659441 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.659189 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2226d070-e026-4dbf-8dad-527a1aa3eb7d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.659441 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.659200 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2226d070-e026-4dbf-8dad-527a1aa3eb7d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.659441 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.659237 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2226d070-e026-4dbf-8dad-527a1aa3eb7d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.659441 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.659252 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.659441 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.659290 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.660423 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.659485 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2226d070-e026-4dbf-8dad-527a1aa3eb7d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.660423 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.659862 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2226d070-e026-4dbf-8dad-527a1aa3eb7d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.661501 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.661470 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2226d070-e026-4dbf-8dad-527a1aa3eb7d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.661626 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.661588 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.661703 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.661685 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.661767 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.661717 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.661819 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.661799 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2226d070-e026-4dbf-8dad-527a1aa3eb7d-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.662136 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.662112 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.662437 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.662415 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-web-config\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.662958 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.662939 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.663548 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.663530 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.664030 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.664014 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2226d070-e026-4dbf-8dad-527a1aa3eb7d-config-out\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.664309 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.664289 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-config\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.664348 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.664329 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2226d070-e026-4dbf-8dad-527a1aa3eb7d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.664962 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.664944 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2226d070-e026-4dbf-8dad-527a1aa3eb7d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.667306 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.667285 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j495g\" (UniqueName: \"kubernetes.io/projected/2226d070-e026-4dbf-8dad-527a1aa3eb7d-kube-api-access-j495g\") pod \"prometheus-k8s-0\" (UID: \"2226d070-e026-4dbf-8dad-527a1aa3eb7d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.809468 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.809429 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:20.930212 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:20.930187 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:56:20.932346 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:56:20.932321 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2226d070_e026_4dbf_8dad_527a1aa3eb7d.slice/crio-5c62ce5eb167bda1d611d862b2c1d9f2adc4330bea2b8e43baf687cdf61ad190 WatchSource:0}: Error finding container 5c62ce5eb167bda1d611d862b2c1d9f2adc4330bea2b8e43baf687cdf61ad190: Status 404 returned error can't find the container with id 5c62ce5eb167bda1d611d862b2c1d9f2adc4330bea2b8e43baf687cdf61ad190
Apr 16 19:56:21.444676 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:21.444641 2563 generic.go:358] "Generic (PLEG): container finished" podID="2226d070-e026-4dbf-8dad-527a1aa3eb7d" containerID="4700b584a55e0d15a63fb73011b6cb66e9c2a22b22db4778ae3e3d46590b4bc0" exitCode=0
Apr 16 19:56:21.445044 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:21.444694 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2226d070-e026-4dbf-8dad-527a1aa3eb7d","Type":"ContainerDied","Data":"4700b584a55e0d15a63fb73011b6cb66e9c2a22b22db4778ae3e3d46590b4bc0"}
Apr 16 19:56:21.445044 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:21.444713 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2226d070-e026-4dbf-8dad-527a1aa3eb7d","Type":"ContainerStarted","Data":"5c62ce5eb167bda1d611d862b2c1d9f2adc4330bea2b8e43baf687cdf61ad190"}
Apr 16 19:56:21.799326 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:21.799297 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89da7cf8-3e31-4de4-a28a-a83ce3eb145c" path="/var/lib/kubelet/pods/89da7cf8-3e31-4de4-a28a-a83ce3eb145c/volumes"
Apr 16 19:56:22.450307 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:22.450275 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2226d070-e026-4dbf-8dad-527a1aa3eb7d","Type":"ContainerStarted","Data":"d0aebb0e9d38ac5e676dcc3493b03ff6bf5d5574475d90dd75228fc34898050a"}
Apr 16 19:56:22.450307 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:22.450306 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2226d070-e026-4dbf-8dad-527a1aa3eb7d","Type":"ContainerStarted","Data":"bad62db36ebfb1bb57fcaca3a4b7a26d18e3f95f80a5aa285492d9e64189fde5"}
Apr 16 19:56:22.450717 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:22.450318 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2226d070-e026-4dbf-8dad-527a1aa3eb7d","Type":"ContainerStarted","Data":"d92e791ce68077eb9156690526ac95b5d8d270e2d70f42f0e2141cbf514260d8"}
Apr 16 19:56:22.450717 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:22.450326 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2226d070-e026-4dbf-8dad-527a1aa3eb7d","Type":"ContainerStarted","Data":"3e8ed4e0b52e2347e7d6a15266e6b7b0148e8a04084370c30cf04d90bb3e451a"}
Apr 16 19:56:22.450717 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:22.450335 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2226d070-e026-4dbf-8dad-527a1aa3eb7d","Type":"ContainerStarted","Data":"28160ab1ad0071cdc1aaf838c980459aebd7c08cd703c5c07caad7d50d2bce29"}
Apr 16 19:56:22.450717 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:22.450342 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2226d070-e026-4dbf-8dad-527a1aa3eb7d","Type":"ContainerStarted","Data":"b2942d1eb9e5795db78c9fd9d2e635f641d5110b62c02537fab4fd62ff22db51"}
Apr 16 19:56:22.481402 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:22.478274 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.478257866 podStartE2EDuration="2.478257866s" podCreationTimestamp="2026-04-16 19:56:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:56:22.475359993 +0000 UTC m=+161.251937016" watchObservedRunningTime="2026-04-16 19:56:22.478257866 +0000 UTC m=+161.254834865"
Apr 16 19:56:25.810529 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:56:25.810491 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:20.810544 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:57:20.810497 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:20.825450 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:57:20.825414 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:21.622215 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:57:21.622190 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:58:41.648268 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:58:41.648243 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log"
Apr 16 19:58:41.650191 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:58:41.650171 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log"
Apr 16 19:58:41.655057 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:58:41.655039 2563 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 19:59:56.884885 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:59:56.884805 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-twnvx"]
Apr 16 19:59:56.887831 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:59:56.887814 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-twnvx"
Apr 16 19:59:56.890506 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:59:56.890483 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 19:59:56.890667 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:59:56.890509 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 19:59:56.890805 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:59:56.890789 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 19:59:56.891795 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:59:56.891779 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-n4npf\""
Apr 16 19:59:56.904407 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:59:56.904381 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-twnvx"]
Apr 16 19:59:56.975451 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:59:56.975423 2563 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2dd12e1f-5557-40d7-8f6d-72087eb846db-data\") pod \"seaweedfs-86cc847c5c-twnvx\" (UID: \"2dd12e1f-5557-40d7-8f6d-72087eb846db\") " pod="kserve/seaweedfs-86cc847c5c-twnvx" Apr 16 19:59:56.975451 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:59:56.975454 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tttxw\" (UniqueName: \"kubernetes.io/projected/2dd12e1f-5557-40d7-8f6d-72087eb846db-kube-api-access-tttxw\") pod \"seaweedfs-86cc847c5c-twnvx\" (UID: \"2dd12e1f-5557-40d7-8f6d-72087eb846db\") " pod="kserve/seaweedfs-86cc847c5c-twnvx" Apr 16 19:59:57.080808 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:59:57.080584 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2dd12e1f-5557-40d7-8f6d-72087eb846db-data\") pod \"seaweedfs-86cc847c5c-twnvx\" (UID: \"2dd12e1f-5557-40d7-8f6d-72087eb846db\") " pod="kserve/seaweedfs-86cc847c5c-twnvx" Apr 16 19:59:57.081068 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:59:57.081045 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tttxw\" (UniqueName: \"kubernetes.io/projected/2dd12e1f-5557-40d7-8f6d-72087eb846db-kube-api-access-tttxw\") pod \"seaweedfs-86cc847c5c-twnvx\" (UID: \"2dd12e1f-5557-40d7-8f6d-72087eb846db\") " pod="kserve/seaweedfs-86cc847c5c-twnvx" Apr 16 19:59:57.081314 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:59:57.081297 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2dd12e1f-5557-40d7-8f6d-72087eb846db-data\") pod \"seaweedfs-86cc847c5c-twnvx\" (UID: \"2dd12e1f-5557-40d7-8f6d-72087eb846db\") " pod="kserve/seaweedfs-86cc847c5c-twnvx" Apr 16 19:59:57.090536 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:59:57.090509 2563 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tttxw\" (UniqueName: \"kubernetes.io/projected/2dd12e1f-5557-40d7-8f6d-72087eb846db-kube-api-access-tttxw\") pod \"seaweedfs-86cc847c5c-twnvx\" (UID: \"2dd12e1f-5557-40d7-8f6d-72087eb846db\") " pod="kserve/seaweedfs-86cc847c5c-twnvx" Apr 16 19:59:57.197237 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:59:57.197164 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-twnvx" Apr 16 19:59:57.313175 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:59:57.313147 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-twnvx"] Apr 16 19:59:57.316319 ip-10-0-135-244 kubenswrapper[2563]: W0416 19:59:57.316290 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dd12e1f_5557_40d7_8f6d_72087eb846db.slice/crio-c26e347f5661a1bf815f1ecf13dcb2ba08cfc5b58e374686842adeca2e200727 WatchSource:0}: Error finding container c26e347f5661a1bf815f1ecf13dcb2ba08cfc5b58e374686842adeca2e200727: Status 404 returned error can't find the container with id c26e347f5661a1bf815f1ecf13dcb2ba08cfc5b58e374686842adeca2e200727 Apr 16 19:59:57.317465 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:59:57.317448 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:59:58.006456 ip-10-0-135-244 kubenswrapper[2563]: I0416 19:59:58.006413 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-twnvx" event={"ID":"2dd12e1f-5557-40d7-8f6d-72087eb846db","Type":"ContainerStarted","Data":"c26e347f5661a1bf815f1ecf13dcb2ba08cfc5b58e374686842adeca2e200727"} Apr 16 20:00:00.013293 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:00.013260 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-twnvx" 
event={"ID":"2dd12e1f-5557-40d7-8f6d-72087eb846db","Type":"ContainerStarted","Data":"e4e3704a03fec3bd68e14c9a929f7eecd76649792debccdf62db014a80db674c"} Apr 16 20:00:00.013690 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:00.013311 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-twnvx" Apr 16 20:00:00.031802 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:00.031714 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-twnvx" podStartSLOduration=1.47143045 podStartE2EDuration="4.03169716s" podCreationTimestamp="2026-04-16 19:59:56 +0000 UTC" firstStartedPulling="2026-04-16 19:59:57.317570222 +0000 UTC m=+376.094147199" lastFinishedPulling="2026-04-16 19:59:59.877836927 +0000 UTC m=+378.654413909" observedRunningTime="2026-04-16 20:00:00.030823962 +0000 UTC m=+378.807400971" watchObservedRunningTime="2026-04-16 20:00:00.03169716 +0000 UTC m=+378.808274160" Apr 16 20:00:06.018294 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:06.018259 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-twnvx" Apr 16 20:00:33.820110 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:33.820073 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-zcv4s"] Apr 16 20:00:33.822946 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:33.822927 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-zcv4s" Apr 16 20:00:33.825731 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:33.825710 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 20:00:33.826941 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:33.826925 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-drflm\"" Apr 16 20:00:33.831029 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:33.831008 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-zcv4s"] Apr 16 20:00:33.846887 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:33.846864 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vtb6\" (UniqueName: \"kubernetes.io/projected/81118176-8453-47ec-aebb-16dbdb36e543-kube-api-access-8vtb6\") pod \"kserve-controller-manager-659c8cbdc-zcv4s\" (UID: \"81118176-8453-47ec-aebb-16dbdb36e543\") " pod="kserve/kserve-controller-manager-659c8cbdc-zcv4s" Apr 16 20:00:33.846997 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:33.846897 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81118176-8453-47ec-aebb-16dbdb36e543-cert\") pod \"kserve-controller-manager-659c8cbdc-zcv4s\" (UID: \"81118176-8453-47ec-aebb-16dbdb36e543\") " pod="kserve/kserve-controller-manager-659c8cbdc-zcv4s" Apr 16 20:00:33.947327 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:33.947296 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vtb6\" (UniqueName: \"kubernetes.io/projected/81118176-8453-47ec-aebb-16dbdb36e543-kube-api-access-8vtb6\") pod \"kserve-controller-manager-659c8cbdc-zcv4s\" (UID: \"81118176-8453-47ec-aebb-16dbdb36e543\") " 
pod="kserve/kserve-controller-manager-659c8cbdc-zcv4s" Apr 16 20:00:33.947453 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:33.947335 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81118176-8453-47ec-aebb-16dbdb36e543-cert\") pod \"kserve-controller-manager-659c8cbdc-zcv4s\" (UID: \"81118176-8453-47ec-aebb-16dbdb36e543\") " pod="kserve/kserve-controller-manager-659c8cbdc-zcv4s" Apr 16 20:00:33.949687 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:33.949665 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81118176-8453-47ec-aebb-16dbdb36e543-cert\") pod \"kserve-controller-manager-659c8cbdc-zcv4s\" (UID: \"81118176-8453-47ec-aebb-16dbdb36e543\") " pod="kserve/kserve-controller-manager-659c8cbdc-zcv4s" Apr 16 20:00:33.958785 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:33.958765 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vtb6\" (UniqueName: \"kubernetes.io/projected/81118176-8453-47ec-aebb-16dbdb36e543-kube-api-access-8vtb6\") pod \"kserve-controller-manager-659c8cbdc-zcv4s\" (UID: \"81118176-8453-47ec-aebb-16dbdb36e543\") " pod="kserve/kserve-controller-manager-659c8cbdc-zcv4s" Apr 16 20:00:34.133348 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:34.133317 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-zcv4s" Apr 16 20:00:34.251591 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:34.251563 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-zcv4s"] Apr 16 20:00:34.254739 ip-10-0-135-244 kubenswrapper[2563]: W0416 20:00:34.254714 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81118176_8453_47ec_aebb_16dbdb36e543.slice/crio-b97812722e08951f7d1104edcd28a545ae2fd0b616bdb6e2013cfd403645549a WatchSource:0}: Error finding container b97812722e08951f7d1104edcd28a545ae2fd0b616bdb6e2013cfd403645549a: Status 404 returned error can't find the container with id b97812722e08951f7d1104edcd28a545ae2fd0b616bdb6e2013cfd403645549a Apr 16 20:00:35.112385 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:35.112348 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-zcv4s" event={"ID":"81118176-8453-47ec-aebb-16dbdb36e543","Type":"ContainerStarted","Data":"b97812722e08951f7d1104edcd28a545ae2fd0b616bdb6e2013cfd403645549a"} Apr 16 20:00:37.118447 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:37.118415 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-zcv4s" event={"ID":"81118176-8453-47ec-aebb-16dbdb36e543","Type":"ContainerStarted","Data":"dc4023f353c6bb532217c09782aa1100cbf927495ea8252aa1a2b066120a1b64"} Apr 16 20:00:37.118925 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:37.118560 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-659c8cbdc-zcv4s" Apr 16 20:00:37.140955 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:00:37.140909 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-659c8cbdc-zcv4s" podStartSLOduration=1.577590316 
podStartE2EDuration="4.140894097s" podCreationTimestamp="2026-04-16 20:00:33 +0000 UTC" firstStartedPulling="2026-04-16 20:00:34.256073311 +0000 UTC m=+413.032650289" lastFinishedPulling="2026-04-16 20:00:36.819377084 +0000 UTC m=+415.595954070" observedRunningTime="2026-04-16 20:00:37.139439689 +0000 UTC m=+415.916016709" watchObservedRunningTime="2026-04-16 20:00:37.140894097 +0000 UTC m=+415.917471096" Apr 16 20:01:08.125940 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:08.125866 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-659c8cbdc-zcv4s" Apr 16 20:01:25.586690 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:25.586661 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-zk4jn"] Apr 16 20:01:25.589890 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:25.589869 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-zk4jn" Apr 16 20:01:25.596007 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:25.595980 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-zk4jn"] Apr 16 20:01:25.715211 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:25.715181 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq5xj\" (UniqueName: \"kubernetes.io/projected/186fa644-f8cd-4d1f-af46-25ac0647fcc4-kube-api-access-rq5xj\") pod \"s3-init-zk4jn\" (UID: \"186fa644-f8cd-4d1f-af46-25ac0647fcc4\") " pod="kserve/s3-init-zk4jn" Apr 16 20:01:25.815902 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:25.815873 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rq5xj\" (UniqueName: \"kubernetes.io/projected/186fa644-f8cd-4d1f-af46-25ac0647fcc4-kube-api-access-rq5xj\") pod \"s3-init-zk4jn\" (UID: \"186fa644-f8cd-4d1f-af46-25ac0647fcc4\") " pod="kserve/s3-init-zk4jn" Apr 16 20:01:25.826333 ip-10-0-135-244 kubenswrapper[2563]: I0416 
20:01:25.826303 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq5xj\" (UniqueName: \"kubernetes.io/projected/186fa644-f8cd-4d1f-af46-25ac0647fcc4-kube-api-access-rq5xj\") pod \"s3-init-zk4jn\" (UID: \"186fa644-f8cd-4d1f-af46-25ac0647fcc4\") " pod="kserve/s3-init-zk4jn" Apr 16 20:01:25.911361 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:25.911334 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-zk4jn" Apr 16 20:01:26.027158 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:26.027135 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-zk4jn"] Apr 16 20:01:26.029438 ip-10-0-135-244 kubenswrapper[2563]: W0416 20:01:26.029405 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod186fa644_f8cd_4d1f_af46_25ac0647fcc4.slice/crio-62cddaaaaeedb246105761877b96d290059ce55b99a7f54a9e2186259854aa45 WatchSource:0}: Error finding container 62cddaaaaeedb246105761877b96d290059ce55b99a7f54a9e2186259854aa45: Status 404 returned error can't find the container with id 62cddaaaaeedb246105761877b96d290059ce55b99a7f54a9e2186259854aa45 Apr 16 20:01:26.251042 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:26.250969 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-zk4jn" event={"ID":"186fa644-f8cd-4d1f-af46-25ac0647fcc4","Type":"ContainerStarted","Data":"62cddaaaaeedb246105761877b96d290059ce55b99a7f54a9e2186259854aa45"} Apr 16 20:01:31.267043 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:31.267001 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-zk4jn" event={"ID":"186fa644-f8cd-4d1f-af46-25ac0647fcc4","Type":"ContainerStarted","Data":"f2790c1fd8a83ce90c79daea03a25063c7156f78f93adfa72b20c3cb2b0c569c"} Apr 16 20:01:31.283348 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:31.283296 2563 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kserve/s3-init-zk4jn" podStartSLOduration=1.5346862 podStartE2EDuration="6.28327887s" podCreationTimestamp="2026-04-16 20:01:25 +0000 UTC" firstStartedPulling="2026-04-16 20:01:26.031536483 +0000 UTC m=+464.808113460" lastFinishedPulling="2026-04-16 20:01:30.78012915 +0000 UTC m=+469.556706130" observedRunningTime="2026-04-16 20:01:31.282433083 +0000 UTC m=+470.059010081" watchObservedRunningTime="2026-04-16 20:01:31.28327887 +0000 UTC m=+470.059855870" Apr 16 20:01:34.276054 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:34.276017 2563 generic.go:358] "Generic (PLEG): container finished" podID="186fa644-f8cd-4d1f-af46-25ac0647fcc4" containerID="f2790c1fd8a83ce90c79daea03a25063c7156f78f93adfa72b20c3cb2b0c569c" exitCode=0 Apr 16 20:01:34.276423 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:34.276080 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-zk4jn" event={"ID":"186fa644-f8cd-4d1f-af46-25ac0647fcc4","Type":"ContainerDied","Data":"f2790c1fd8a83ce90c79daea03a25063c7156f78f93adfa72b20c3cb2b0c569c"} Apr 16 20:01:35.396478 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:35.396456 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-zk4jn" Apr 16 20:01:35.493052 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:35.493019 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq5xj\" (UniqueName: \"kubernetes.io/projected/186fa644-f8cd-4d1f-af46-25ac0647fcc4-kube-api-access-rq5xj\") pod \"186fa644-f8cd-4d1f-af46-25ac0647fcc4\" (UID: \"186fa644-f8cd-4d1f-af46-25ac0647fcc4\") " Apr 16 20:01:35.495095 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:35.495061 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/186fa644-f8cd-4d1f-af46-25ac0647fcc4-kube-api-access-rq5xj" (OuterVolumeSpecName: "kube-api-access-rq5xj") pod "186fa644-f8cd-4d1f-af46-25ac0647fcc4" (UID: "186fa644-f8cd-4d1f-af46-25ac0647fcc4"). InnerVolumeSpecName "kube-api-access-rq5xj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:01:35.593499 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:35.593426 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rq5xj\" (UniqueName: \"kubernetes.io/projected/186fa644-f8cd-4d1f-af46-25ac0647fcc4-kube-api-access-rq5xj\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 20:01:36.281812 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:36.281787 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-zk4jn" Apr 16 20:01:36.281977 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:36.281782 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-zk4jn" event={"ID":"186fa644-f8cd-4d1f-af46-25ac0647fcc4","Type":"ContainerDied","Data":"62cddaaaaeedb246105761877b96d290059ce55b99a7f54a9e2186259854aa45"} Apr 16 20:01:36.281977 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:36.281900 2563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62cddaaaaeedb246105761877b96d290059ce55b99a7f54a9e2186259854aa45" Apr 16 20:01:46.178237 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:46.178204 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-gphjt"] Apr 16 20:01:46.178695 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:46.178470 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="186fa644-f8cd-4d1f-af46-25ac0647fcc4" containerName="s3-init" Apr 16 20:01:46.178695 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:46.178481 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="186fa644-f8cd-4d1f-af46-25ac0647fcc4" containerName="s3-init" Apr 16 20:01:46.178695 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:46.178523 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="186fa644-f8cd-4d1f-af46-25ac0647fcc4" containerName="s3-init" Apr 16 20:01:46.181398 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:46.181382 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-gphjt" Apr 16 20:01:46.183980 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:46.183961 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 16 20:01:46.188783 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:46.188756 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-gphjt"] Apr 16 20:01:46.272295 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:46.272269 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz6w6\" (UniqueName: \"kubernetes.io/projected/6ed4e661-f1a5-4338-afbe-ee7ef4fd7206-kube-api-access-rz6w6\") pod \"s3-tls-init-custom-gphjt\" (UID: \"6ed4e661-f1a5-4338-afbe-ee7ef4fd7206\") " pod="kserve/s3-tls-init-custom-gphjt" Apr 16 20:01:46.373525 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:46.373499 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rz6w6\" (UniqueName: \"kubernetes.io/projected/6ed4e661-f1a5-4338-afbe-ee7ef4fd7206-kube-api-access-rz6w6\") pod \"s3-tls-init-custom-gphjt\" (UID: \"6ed4e661-f1a5-4338-afbe-ee7ef4fd7206\") " pod="kserve/s3-tls-init-custom-gphjt" Apr 16 20:01:46.382334 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:46.382316 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz6w6\" (UniqueName: \"kubernetes.io/projected/6ed4e661-f1a5-4338-afbe-ee7ef4fd7206-kube-api-access-rz6w6\") pod \"s3-tls-init-custom-gphjt\" (UID: \"6ed4e661-f1a5-4338-afbe-ee7ef4fd7206\") " pod="kserve/s3-tls-init-custom-gphjt" Apr 16 20:01:46.505111 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:46.505041 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-gphjt" Apr 16 20:01:46.623170 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:46.623143 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-gphjt"] Apr 16 20:01:46.626013 ip-10-0-135-244 kubenswrapper[2563]: W0416 20:01:46.625987 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ed4e661_f1a5_4338_afbe_ee7ef4fd7206.slice/crio-bdeb5589dd632bde89f7dc20fb0f0d195599920d77efd7203d019a4d62060b9f WatchSource:0}: Error finding container bdeb5589dd632bde89f7dc20fb0f0d195599920d77efd7203d019a4d62060b9f: Status 404 returned error can't find the container with id bdeb5589dd632bde89f7dc20fb0f0d195599920d77efd7203d019a4d62060b9f Apr 16 20:01:47.312907 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:47.312866 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-gphjt" event={"ID":"6ed4e661-f1a5-4338-afbe-ee7ef4fd7206","Type":"ContainerStarted","Data":"a9013907c95c7efdc5cd4d06aec6765060462b1fe6fe50f35022c5bf13329d11"} Apr 16 20:01:47.312907 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:47.312909 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-gphjt" event={"ID":"6ed4e661-f1a5-4338-afbe-ee7ef4fd7206","Type":"ContainerStarted","Data":"bdeb5589dd632bde89f7dc20fb0f0d195599920d77efd7203d019a4d62060b9f"} Apr 16 20:01:47.329879 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:47.329836 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-gphjt" podStartSLOduration=1.329821187 podStartE2EDuration="1.329821187s" podCreationTimestamp="2026-04-16 20:01:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:01:47.328710675 +0000 UTC m=+486.105287675" watchObservedRunningTime="2026-04-16 
20:01:47.329821187 +0000 UTC m=+486.106398185" Apr 16 20:01:51.324876 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:51.324795 2563 generic.go:358] "Generic (PLEG): container finished" podID="6ed4e661-f1a5-4338-afbe-ee7ef4fd7206" containerID="a9013907c95c7efdc5cd4d06aec6765060462b1fe6fe50f35022c5bf13329d11" exitCode=0 Apr 16 20:01:51.324876 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:51.324853 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-gphjt" event={"ID":"6ed4e661-f1a5-4338-afbe-ee7ef4fd7206","Type":"ContainerDied","Data":"a9013907c95c7efdc5cd4d06aec6765060462b1fe6fe50f35022c5bf13329d11"} Apr 16 20:01:52.449307 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:52.449284 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-gphjt" Apr 16 20:01:52.619407 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:52.619369 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz6w6\" (UniqueName: \"kubernetes.io/projected/6ed4e661-f1a5-4338-afbe-ee7ef4fd7206-kube-api-access-rz6w6\") pod \"6ed4e661-f1a5-4338-afbe-ee7ef4fd7206\" (UID: \"6ed4e661-f1a5-4338-afbe-ee7ef4fd7206\") " Apr 16 20:01:52.621387 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:52.621358 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ed4e661-f1a5-4338-afbe-ee7ef4fd7206-kube-api-access-rz6w6" (OuterVolumeSpecName: "kube-api-access-rz6w6") pod "6ed4e661-f1a5-4338-afbe-ee7ef4fd7206" (UID: "6ed4e661-f1a5-4338-afbe-ee7ef4fd7206"). InnerVolumeSpecName "kube-api-access-rz6w6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:01:52.720524 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:52.720491 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rz6w6\" (UniqueName: \"kubernetes.io/projected/6ed4e661-f1a5-4338-afbe-ee7ef4fd7206-kube-api-access-rz6w6\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 20:01:53.331209 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:53.331183 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-gphjt" Apr 16 20:01:53.331209 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:53.331200 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-gphjt" event={"ID":"6ed4e661-f1a5-4338-afbe-ee7ef4fd7206","Type":"ContainerDied","Data":"bdeb5589dd632bde89f7dc20fb0f0d195599920d77efd7203d019a4d62060b9f"} Apr 16 20:01:53.331398 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:53.331229 2563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdeb5589dd632bde89f7dc20fb0f0d195599920d77efd7203d019a4d62060b9f" Apr 16 20:01:53.962565 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:53.962529 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-kgj7c"] Apr 16 20:01:53.963025 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:53.962930 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ed4e661-f1a5-4338-afbe-ee7ef4fd7206" containerName="s3-tls-init-custom" Apr 16 20:01:53.963025 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:53.962948 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ed4e661-f1a5-4338-afbe-ee7ef4fd7206" containerName="s3-tls-init-custom" Apr 16 20:01:53.963025 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:53.963023 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ed4e661-f1a5-4338-afbe-ee7ef4fd7206" 
containerName="s3-tls-init-custom" Apr 16 20:01:53.995173 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:53.995144 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-kgj7c"] Apr 16 20:01:53.995314 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:53.995250 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-kgj7c" Apr 16 20:01:53.998787 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:53.998760 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 16 20:01:53.999072 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:53.999051 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 16 20:01:54.129373 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:54.129338 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nkmk\" (UniqueName: \"kubernetes.io/projected/4c355f66-b85a-42de-b606-32849f924a8d-kube-api-access-5nkmk\") pod \"seaweedfs-tls-serving-7fd5766db9-kgj7c\" (UID: \"4c355f66-b85a-42de-b606-32849f924a8d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-kgj7c" Apr 16 20:01:54.129540 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:54.129418 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4c355f66-b85a-42de-b606-32849f924a8d-data\") pod \"seaweedfs-tls-serving-7fd5766db9-kgj7c\" (UID: \"4c355f66-b85a-42de-b606-32849f924a8d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-kgj7c" Apr 16 20:01:54.129540 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:54.129469 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: 
\"kubernetes.io/projected/4c355f66-b85a-42de-b606-32849f924a8d-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-kgj7c\" (UID: \"4c355f66-b85a-42de-b606-32849f924a8d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-kgj7c" Apr 16 20:01:54.229907 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:54.229833 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nkmk\" (UniqueName: \"kubernetes.io/projected/4c355f66-b85a-42de-b606-32849f924a8d-kube-api-access-5nkmk\") pod \"seaweedfs-tls-serving-7fd5766db9-kgj7c\" (UID: \"4c355f66-b85a-42de-b606-32849f924a8d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-kgj7c" Apr 16 20:01:54.229907 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:54.229881 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4c355f66-b85a-42de-b606-32849f924a8d-data\") pod \"seaweedfs-tls-serving-7fd5766db9-kgj7c\" (UID: \"4c355f66-b85a-42de-b606-32849f924a8d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-kgj7c" Apr 16 20:01:54.229907 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:54.229900 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/4c355f66-b85a-42de-b606-32849f924a8d-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-kgj7c\" (UID: \"4c355f66-b85a-42de-b606-32849f924a8d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-kgj7c" Apr 16 20:01:54.230275 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:54.230253 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4c355f66-b85a-42de-b606-32849f924a8d-data\") pod \"seaweedfs-tls-serving-7fd5766db9-kgj7c\" (UID: \"4c355f66-b85a-42de-b606-32849f924a8d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-kgj7c" Apr 16 20:01:54.232212 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:54.232192 
2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/4c355f66-b85a-42de-b606-32849f924a8d-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-kgj7c\" (UID: \"4c355f66-b85a-42de-b606-32849f924a8d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-kgj7c" Apr 16 20:01:54.240001 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:54.239980 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nkmk\" (UniqueName: \"kubernetes.io/projected/4c355f66-b85a-42de-b606-32849f924a8d-kube-api-access-5nkmk\") pod \"seaweedfs-tls-serving-7fd5766db9-kgj7c\" (UID: \"4c355f66-b85a-42de-b606-32849f924a8d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-kgj7c" Apr 16 20:01:54.304473 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:54.304451 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-kgj7c" Apr 16 20:01:54.420871 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:54.420838 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-kgj7c"] Apr 16 20:01:54.423762 ip-10-0-135-244 kubenswrapper[2563]: W0416 20:01:54.423733 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c355f66_b85a_42de_b606_32849f924a8d.slice/crio-51a97e2c361faca860dbb0b2afdd7cdf84120b29f187ee62df09c348e7af88d4 WatchSource:0}: Error finding container 51a97e2c361faca860dbb0b2afdd7cdf84120b29f187ee62df09c348e7af88d4: Status 404 returned error can't find the container with id 51a97e2c361faca860dbb0b2afdd7cdf84120b29f187ee62df09c348e7af88d4 Apr 16 20:01:55.337878 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:55.337844 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-kgj7c" 
event={"ID":"4c355f66-b85a-42de-b606-32849f924a8d","Type":"ContainerStarted","Data":"070396d553d3dd85784671de6c40b195808c9abfe3c2f74249589d30e97bca28"} Apr 16 20:01:55.337878 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:55.337878 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-kgj7c" event={"ID":"4c355f66-b85a-42de-b606-32849f924a8d","Type":"ContainerStarted","Data":"51a97e2c361faca860dbb0b2afdd7cdf84120b29f187ee62df09c348e7af88d4"} Apr 16 20:01:55.354484 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:55.354441 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-kgj7c" podStartSLOduration=2.063077977 podStartE2EDuration="2.354428729s" podCreationTimestamp="2026-04-16 20:01:53 +0000 UTC" firstStartedPulling="2026-04-16 20:01:54.424992929 +0000 UTC m=+493.201569909" lastFinishedPulling="2026-04-16 20:01:54.716343685 +0000 UTC m=+493.492920661" observedRunningTime="2026-04-16 20:01:55.354318613 +0000 UTC m=+494.130895609" watchObservedRunningTime="2026-04-16 20:01:55.354428729 +0000 UTC m=+494.131005757" Apr 16 20:01:55.893727 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:55.893697 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-54ggl"] Apr 16 20:01:55.898097 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:55.898079 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-54ggl" Apr 16 20:01:55.903825 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:55.903800 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-54ggl"] Apr 16 20:01:56.044701 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:56.044671 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6twm\" (UniqueName: \"kubernetes.io/projected/026cf755-d4b4-4854-a1d6-b2eb9ad872c3-kube-api-access-p6twm\") pod \"s3-tls-init-serving-54ggl\" (UID: \"026cf755-d4b4-4854-a1d6-b2eb9ad872c3\") " pod="kserve/s3-tls-init-serving-54ggl" Apr 16 20:01:56.145630 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:56.145547 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6twm\" (UniqueName: \"kubernetes.io/projected/026cf755-d4b4-4854-a1d6-b2eb9ad872c3-kube-api-access-p6twm\") pod \"s3-tls-init-serving-54ggl\" (UID: \"026cf755-d4b4-4854-a1d6-b2eb9ad872c3\") " pod="kserve/s3-tls-init-serving-54ggl" Apr 16 20:01:56.156117 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:56.156086 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6twm\" (UniqueName: \"kubernetes.io/projected/026cf755-d4b4-4854-a1d6-b2eb9ad872c3-kube-api-access-p6twm\") pod \"s3-tls-init-serving-54ggl\" (UID: \"026cf755-d4b4-4854-a1d6-b2eb9ad872c3\") " pod="kserve/s3-tls-init-serving-54ggl" Apr 16 20:01:56.217409 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:56.217383 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-54ggl" Apr 16 20:01:56.346956 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:56.346927 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-54ggl"] Apr 16 20:01:56.350523 ip-10-0-135-244 kubenswrapper[2563]: W0416 20:01:56.350499 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod026cf755_d4b4_4854_a1d6_b2eb9ad872c3.slice/crio-4e0c6aedd43440b12c094dff8ff78d6643f37cb94fd9987671d7cb967a81b9f2 WatchSource:0}: Error finding container 4e0c6aedd43440b12c094dff8ff78d6643f37cb94fd9987671d7cb967a81b9f2: Status 404 returned error can't find the container with id 4e0c6aedd43440b12c094dff8ff78d6643f37cb94fd9987671d7cb967a81b9f2 Apr 16 20:01:57.343580 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:57.343548 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-54ggl" event={"ID":"026cf755-d4b4-4854-a1d6-b2eb9ad872c3","Type":"ContainerStarted","Data":"f976019c7016700c839cbd769f5e89cf96727a7a500a7514912352c9c233e2dc"} Apr 16 20:01:57.343580 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:57.343580 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-54ggl" event={"ID":"026cf755-d4b4-4854-a1d6-b2eb9ad872c3","Type":"ContainerStarted","Data":"4e0c6aedd43440b12c094dff8ff78d6643f37cb94fd9987671d7cb967a81b9f2"} Apr 16 20:01:57.361675 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:01:57.361633 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-54ggl" podStartSLOduration=2.361619348 podStartE2EDuration="2.361619348s" podCreationTimestamp="2026-04-16 20:01:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:01:57.359508742 +0000 UTC m=+496.136085741" watchObservedRunningTime="2026-04-16 
20:01:57.361619348 +0000 UTC m=+496.138196346" Apr 16 20:02:00.353459 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:02:00.353380 2563 generic.go:358] "Generic (PLEG): container finished" podID="026cf755-d4b4-4854-a1d6-b2eb9ad872c3" containerID="f976019c7016700c839cbd769f5e89cf96727a7a500a7514912352c9c233e2dc" exitCode=0 Apr 16 20:02:00.353459 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:02:00.353434 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-54ggl" event={"ID":"026cf755-d4b4-4854-a1d6-b2eb9ad872c3","Type":"ContainerDied","Data":"f976019c7016700c839cbd769f5e89cf96727a7a500a7514912352c9c233e2dc"} Apr 16 20:02:01.477931 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:02:01.477911 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-54ggl" Apr 16 20:02:01.586304 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:02:01.586277 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6twm\" (UniqueName: \"kubernetes.io/projected/026cf755-d4b4-4854-a1d6-b2eb9ad872c3-kube-api-access-p6twm\") pod \"026cf755-d4b4-4854-a1d6-b2eb9ad872c3\" (UID: \"026cf755-d4b4-4854-a1d6-b2eb9ad872c3\") " Apr 16 20:02:01.588172 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:02:01.588146 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/026cf755-d4b4-4854-a1d6-b2eb9ad872c3-kube-api-access-p6twm" (OuterVolumeSpecName: "kube-api-access-p6twm") pod "026cf755-d4b4-4854-a1d6-b2eb9ad872c3" (UID: "026cf755-d4b4-4854-a1d6-b2eb9ad872c3"). InnerVolumeSpecName "kube-api-access-p6twm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:02:01.687220 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:02:01.687194 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p6twm\" (UniqueName: \"kubernetes.io/projected/026cf755-d4b4-4854-a1d6-b2eb9ad872c3-kube-api-access-p6twm\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 20:02:02.359116 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:02:02.359079 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-54ggl" event={"ID":"026cf755-d4b4-4854-a1d6-b2eb9ad872c3","Type":"ContainerDied","Data":"4e0c6aedd43440b12c094dff8ff78d6643f37cb94fd9987671d7cb967a81b9f2"} Apr 16 20:02:02.359116 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:02:02.359111 2563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e0c6aedd43440b12c094dff8ff78d6643f37cb94fd9987671d7cb967a81b9f2" Apr 16 20:02:02.359116 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:02:02.359118 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-54ggl" Apr 16 20:03:41.669507 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:03:41.669461 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:03:41.670133 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:03:41.670115 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:05:22.248150 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:05:22.248116 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-twkc5"] Apr 16 20:05:22.248572 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:05:22.248558 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="026cf755-d4b4-4854-a1d6-b2eb9ad872c3" containerName="s3-tls-init-serving" Apr 16 20:05:22.248639 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:05:22.248575 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="026cf755-d4b4-4854-a1d6-b2eb9ad872c3" containerName="s3-tls-init-serving" Apr 16 20:05:22.248687 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:05:22.248655 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="026cf755-d4b4-4854-a1d6-b2eb9ad872c3" containerName="s3-tls-init-serving" Apr 16 20:05:22.251781 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:05:22.251764 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-twkc5" Apr 16 20:05:22.254387 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:05:22.254364 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t49kf\"" Apr 16 20:05:22.259324 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:05:22.259297 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-twkc5"] Apr 16 20:05:22.263707 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:05:22.263692 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-twkc5" Apr 16 20:05:22.389150 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:05:22.389120 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-twkc5"] Apr 16 20:05:22.392536 ip-10-0-135-244 kubenswrapper[2563]: W0416 20:05:22.392508 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56e8ba6f_897d_41e1_bd1b_2ebdb415fa4d.slice/crio-41f0da6937db445c411481fe4663780b5da43d0609ade2ed82b50bf762a4939b WatchSource:0}: Error finding container 41f0da6937db445c411481fe4663780b5da43d0609ade2ed82b50bf762a4939b: Status 404 returned error can't find the container with id 41f0da6937db445c411481fe4663780b5da43d0609ade2ed82b50bf762a4939b Apr 16 20:05:22.394492 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:05:22.394471 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:05:22.903691 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:05:22.903656 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-twkc5" 
event={"ID":"56e8ba6f-897d-41e1-bd1b-2ebdb415fa4d","Type":"ContainerStarted","Data":"41f0da6937db445c411481fe4663780b5da43d0609ade2ed82b50bf762a4939b"} Apr 16 20:05:23.909819 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:05:23.909788 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-twkc5" event={"ID":"56e8ba6f-897d-41e1-bd1b-2ebdb415fa4d","Type":"ContainerStarted","Data":"ac08c813d90f218f1068e0491f45b19642881a1e32d26bd4d4b65e8480dca548"} Apr 16 20:05:23.910224 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:05:23.909997 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-twkc5" Apr 16 20:05:23.911745 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:05:23.911723 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-twkc5" Apr 16 20:05:23.930184 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:05:23.928690 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-twkc5" podStartSLOduration=1.002395562 podStartE2EDuration="1.928670814s" podCreationTimestamp="2026-04-16 20:05:22 +0000 UTC" firstStartedPulling="2026-04-16 20:05:22.394621212 +0000 UTC m=+701.171198192" lastFinishedPulling="2026-04-16 20:05:23.320896468 +0000 UTC m=+702.097473444" observedRunningTime="2026-04-16 20:05:23.925528158 +0000 UTC m=+702.702105137" watchObservedRunningTime="2026-04-16 20:05:23.928670814 +0000 UTC m=+702.705247815" Apr 16 20:06:57.355393 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:06:57.355365 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-7f66cccfb6-twkc5_56e8ba6f-897d-41e1-bd1b-2ebdb415fa4d/kserve-container/0.log" Apr 16 20:06:57.486013 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:06:57.485983 2563 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-twkc5"] Apr 16 20:06:57.486227 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:06:57.486192 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-twkc5" podUID="56e8ba6f-897d-41e1-bd1b-2ebdb415fa4d" containerName="kserve-container" containerID="cri-o://ac08c813d90f218f1068e0491f45b19642881a1e32d26bd4d4b65e8480dca548" gracePeriod=30 Apr 16 20:06:57.726019 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:06:57.726001 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-twkc5" Apr 16 20:06:58.171661 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:06:58.171622 2563 generic.go:358] "Generic (PLEG): container finished" podID="56e8ba6f-897d-41e1-bd1b-2ebdb415fa4d" containerID="ac08c813d90f218f1068e0491f45b19642881a1e32d26bd4d4b65e8480dca548" exitCode=2 Apr 16 20:06:58.171823 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:06:58.171674 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-twkc5" event={"ID":"56e8ba6f-897d-41e1-bd1b-2ebdb415fa4d","Type":"ContainerDied","Data":"ac08c813d90f218f1068e0491f45b19642881a1e32d26bd4d4b65e8480dca548"} Apr 16 20:06:58.171823 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:06:58.171689 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-twkc5" Apr 16 20:06:58.171823 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:06:58.171712 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-twkc5" event={"ID":"56e8ba6f-897d-41e1-bd1b-2ebdb415fa4d","Type":"ContainerDied","Data":"41f0da6937db445c411481fe4663780b5da43d0609ade2ed82b50bf762a4939b"} Apr 16 20:06:58.171823 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:06:58.171728 2563 scope.go:117] "RemoveContainer" containerID="ac08c813d90f218f1068e0491f45b19642881a1e32d26bd4d4b65e8480dca548" Apr 16 20:06:58.181022 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:06:58.180937 2563 scope.go:117] "RemoveContainer" containerID="ac08c813d90f218f1068e0491f45b19642881a1e32d26bd4d4b65e8480dca548" Apr 16 20:06:58.181416 ip-10-0-135-244 kubenswrapper[2563]: E0416 20:06:58.181391 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac08c813d90f218f1068e0491f45b19642881a1e32d26bd4d4b65e8480dca548\": container with ID starting with ac08c813d90f218f1068e0491f45b19642881a1e32d26bd4d4b65e8480dca548 not found: ID does not exist" containerID="ac08c813d90f218f1068e0491f45b19642881a1e32d26bd4d4b65e8480dca548" Apr 16 20:06:58.181485 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:06:58.181424 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac08c813d90f218f1068e0491f45b19642881a1e32d26bd4d4b65e8480dca548"} err="failed to get container status \"ac08c813d90f218f1068e0491f45b19642881a1e32d26bd4d4b65e8480dca548\": rpc error: code = NotFound desc = could not find container \"ac08c813d90f218f1068e0491f45b19642881a1e32d26bd4d4b65e8480dca548\": container with ID starting with ac08c813d90f218f1068e0491f45b19642881a1e32d26bd4d4b65e8480dca548 not found: ID does not exist" Apr 16 20:06:58.188183 ip-10-0-135-244 kubenswrapper[2563]: 
I0416 20:06:58.188160 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-twkc5"] Apr 16 20:06:58.194242 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:06:58.194221 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-twkc5"] Apr 16 20:06:59.798294 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:06:59.798260 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56e8ba6f-897d-41e1-bd1b-2ebdb415fa4d" path="/var/lib/kubelet/pods/56e8ba6f-897d-41e1-bd1b-2ebdb415fa4d/volumes" Apr 16 20:08:41.690322 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:08:41.690246 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:08:41.692518 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:08:41.692498 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:13:41.709297 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:13:41.709270 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:13:41.712084 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:13:41.712063 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:18:41.728317 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:18:41.728289 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:18:41.731116 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:18:41.731096 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:23:41.747576 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:23:41.747547 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:23:41.749862 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:23:41.749841 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:28:41.765107 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:28:41.765071 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:28:41.767770 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:28:41.767750 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:33:41.783824 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:33:41.783791 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:33:41.786758 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:33:41.786734 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:38:41.803177 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:38:41.803092 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:38:41.806425 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:38:41.806401 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:43:41.821133 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:43:41.821103 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:43:41.825650 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:43:41.825630 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:48:41.839159 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:48:41.839132 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:48:41.843609 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:48:41.843576 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:53:41.858023 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:53:41.857926 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:53:41.862133 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:53:41.862104 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log" Apr 16 20:57:16.370921 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:16.370883 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pfjcf/must-gather-s4llb"] Apr 16 20:57:16.371391 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:16.371159 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="56e8ba6f-897d-41e1-bd1b-2ebdb415fa4d" containerName="kserve-container" Apr 16 20:57:16.371391 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:16.371172 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e8ba6f-897d-41e1-bd1b-2ebdb415fa4d" containerName="kserve-container" Apr 16 20:57:16.371391 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:16.371231 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="56e8ba6f-897d-41e1-bd1b-2ebdb415fa4d" containerName="kserve-container" Apr 16 20:57:16.374183 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:16.374163 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pfjcf/must-gather-s4llb" Apr 16 20:57:16.377435 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:16.377417 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pfjcf\"/\"openshift-service-ca.crt\"" Apr 16 20:57:16.377543 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:16.377420 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-pfjcf\"/\"default-dockercfg-5kvmn\"" Apr 16 20:57:16.377724 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:16.377710 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pfjcf\"/\"kube-root-ca.crt\"" Apr 16 20:57:16.385376 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:16.385358 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pfjcf/must-gather-s4llb"] Apr 16 20:57:16.510716 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:16.510687 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a89d263b-e10a-4f9e-a9a0-900edd812513-must-gather-output\") pod \"must-gather-s4llb\" (UID: \"a89d263b-e10a-4f9e-a9a0-900edd812513\") " 
pod="openshift-must-gather-pfjcf/must-gather-s4llb" Apr 16 20:57:16.510834 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:16.510736 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n28wh\" (UniqueName: \"kubernetes.io/projected/a89d263b-e10a-4f9e-a9a0-900edd812513-kube-api-access-n28wh\") pod \"must-gather-s4llb\" (UID: \"a89d263b-e10a-4f9e-a9a0-900edd812513\") " pod="openshift-must-gather-pfjcf/must-gather-s4llb" Apr 16 20:57:16.611348 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:16.611322 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a89d263b-e10a-4f9e-a9a0-900edd812513-must-gather-output\") pod \"must-gather-s4llb\" (UID: \"a89d263b-e10a-4f9e-a9a0-900edd812513\") " pod="openshift-must-gather-pfjcf/must-gather-s4llb" Apr 16 20:57:16.611494 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:16.611373 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n28wh\" (UniqueName: \"kubernetes.io/projected/a89d263b-e10a-4f9e-a9a0-900edd812513-kube-api-access-n28wh\") pod \"must-gather-s4llb\" (UID: \"a89d263b-e10a-4f9e-a9a0-900edd812513\") " pod="openshift-must-gather-pfjcf/must-gather-s4llb" Apr 16 20:57:16.611673 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:16.611653 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a89d263b-e10a-4f9e-a9a0-900edd812513-must-gather-output\") pod \"must-gather-s4llb\" (UID: \"a89d263b-e10a-4f9e-a9a0-900edd812513\") " pod="openshift-must-gather-pfjcf/must-gather-s4llb" Apr 16 20:57:16.619462 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:16.619432 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n28wh\" (UniqueName: 
\"kubernetes.io/projected/a89d263b-e10a-4f9e-a9a0-900edd812513-kube-api-access-n28wh\") pod \"must-gather-s4llb\" (UID: \"a89d263b-e10a-4f9e-a9a0-900edd812513\") " pod="openshift-must-gather-pfjcf/must-gather-s4llb" Apr 16 20:57:16.698737 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:16.698673 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pfjcf/must-gather-s4llb" Apr 16 20:57:16.812242 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:16.812218 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pfjcf/must-gather-s4llb"] Apr 16 20:57:16.814964 ip-10-0-135-244 kubenswrapper[2563]: W0416 20:57:16.814935 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda89d263b_e10a_4f9e_a9a0_900edd812513.slice/crio-c106d1e4732da949f9652581acae19a425c84e88ee054c75bf46cf9cc63eb744 WatchSource:0}: Error finding container c106d1e4732da949f9652581acae19a425c84e88ee054c75bf46cf9cc63eb744: Status 404 returned error can't find the container with id c106d1e4732da949f9652581acae19a425c84e88ee054c75bf46cf9cc63eb744 Apr 16 20:57:16.816665 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:16.816647 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:57:17.380380 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:17.380345 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pfjcf/must-gather-s4llb" event={"ID":"a89d263b-e10a-4f9e-a9a0-900edd812513","Type":"ContainerStarted","Data":"c106d1e4732da949f9652581acae19a425c84e88ee054c75bf46cf9cc63eb744"} Apr 16 20:57:22.396039 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:22.396005 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pfjcf/must-gather-s4llb" 
event={"ID":"a89d263b-e10a-4f9e-a9a0-900edd812513","Type":"ContainerStarted","Data":"7bd2fa3b08c177a15e2ae7d6f6a762ec2763142a78f044d10a46211b57588880"} Apr 16 20:57:22.396039 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:22.396045 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pfjcf/must-gather-s4llb" event={"ID":"a89d263b-e10a-4f9e-a9a0-900edd812513","Type":"ContainerStarted","Data":"ea18f53e8b6c0dd9046634bd4a777e7e37bad4c2791889faf64c454b8065717f"} Apr 16 20:57:22.423398 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:22.423342 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pfjcf/must-gather-s4llb" podStartSLOduration=1.794571333 podStartE2EDuration="6.42332699s" podCreationTimestamp="2026-04-16 20:57:16 +0000 UTC" firstStartedPulling="2026-04-16 20:57:16.816821373 +0000 UTC m=+3815.593398350" lastFinishedPulling="2026-04-16 20:57:21.44557703 +0000 UTC m=+3820.222154007" observedRunningTime="2026-04-16 20:57:22.420393855 +0000 UTC m=+3821.196970858" watchObservedRunningTime="2026-04-16 20:57:22.42332699 +0000 UTC m=+3821.199903990" Apr 16 20:57:40.451500 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:40.451466 2563 generic.go:358] "Generic (PLEG): container finished" podID="a89d263b-e10a-4f9e-a9a0-900edd812513" containerID="ea18f53e8b6c0dd9046634bd4a777e7e37bad4c2791889faf64c454b8065717f" exitCode=0 Apr 16 20:57:40.451901 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:40.451541 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pfjcf/must-gather-s4llb" event={"ID":"a89d263b-e10a-4f9e-a9a0-900edd812513","Type":"ContainerDied","Data":"ea18f53e8b6c0dd9046634bd4a777e7e37bad4c2791889faf64c454b8065717f"} Apr 16 20:57:40.451901 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:40.451851 2563 scope.go:117] "RemoveContainer" containerID="ea18f53e8b6c0dd9046634bd4a777e7e37bad4c2791889faf64c454b8065717f" Apr 16 20:57:41.332287 ip-10-0-135-244 
kubenswrapper[2563]: I0416 20:57:41.332257 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pfjcf_must-gather-s4llb_a89d263b-e10a-4f9e-a9a0-900edd812513/gather/0.log" Apr 16 20:57:44.785272 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:44.785244 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-vpg7x_e922f21a-2e9d-4d74-9bbf-9f154ed71518/global-pull-secret-syncer/0.log" Apr 16 20:57:44.878913 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:44.878883 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-pxn88_5488e199-2008-42c4-ab06-666d5ec0e2bf/konnectivity-agent/0.log" Apr 16 20:57:44.979693 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:44.979667 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-244.ec2.internal_c2fe7a6e5acf4d9e84afac6e0df862e1/haproxy/0.log" Apr 16 20:57:46.793770 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:46.793730 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pfjcf/must-gather-s4llb"] Apr 16 20:57:46.794155 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:46.793937 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-pfjcf/must-gather-s4llb" podUID="a89d263b-e10a-4f9e-a9a0-900edd812513" containerName="copy" containerID="cri-o://7bd2fa3b08c177a15e2ae7d6f6a762ec2763142a78f044d10a46211b57588880" gracePeriod=2 Apr 16 20:57:46.799433 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:46.799405 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pfjcf/must-gather-s4llb"] Apr 16 20:57:47.019711 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.019691 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pfjcf_must-gather-s4llb_a89d263b-e10a-4f9e-a9a0-900edd812513/copy/0.log" Apr 16 20:57:47.020006 
ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.019993 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pfjcf/must-gather-s4llb" Apr 16 20:57:47.022659 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.022634 2563 status_manager.go:895] "Failed to get status for pod" podUID="a89d263b-e10a-4f9e-a9a0-900edd812513" pod="openshift-must-gather-pfjcf/must-gather-s4llb" err="pods \"must-gather-s4llb\" is forbidden: User \"system:node:ip-10-0-135-244.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-pfjcf\": no relationship found between node 'ip-10-0-135-244.ec2.internal' and this object" Apr 16 20:57:47.160909 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.160872 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n28wh\" (UniqueName: \"kubernetes.io/projected/a89d263b-e10a-4f9e-a9a0-900edd812513-kube-api-access-n28wh\") pod \"a89d263b-e10a-4f9e-a9a0-900edd812513\" (UID: \"a89d263b-e10a-4f9e-a9a0-900edd812513\") " Apr 16 20:57:47.161048 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.160928 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a89d263b-e10a-4f9e-a9a0-900edd812513-must-gather-output\") pod \"a89d263b-e10a-4f9e-a9a0-900edd812513\" (UID: \"a89d263b-e10a-4f9e-a9a0-900edd812513\") " Apr 16 20:57:47.162320 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.162294 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a89d263b-e10a-4f9e-a9a0-900edd812513-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a89d263b-e10a-4f9e-a9a0-900edd812513" (UID: "a89d263b-e10a-4f9e-a9a0-900edd812513"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:57:47.163061 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.163037 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a89d263b-e10a-4f9e-a9a0-900edd812513-kube-api-access-n28wh" (OuterVolumeSpecName: "kube-api-access-n28wh") pod "a89d263b-e10a-4f9e-a9a0-900edd812513" (UID: "a89d263b-e10a-4f9e-a9a0-900edd812513"). InnerVolumeSpecName "kube-api-access-n28wh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:57:47.261922 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.261890 2563 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a89d263b-e10a-4f9e-a9a0-900edd812513-must-gather-output\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 20:57:47.261922 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.261917 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n28wh\" (UniqueName: \"kubernetes.io/projected/a89d263b-e10a-4f9e-a9a0-900edd812513-kube-api-access-n28wh\") on node \"ip-10-0-135-244.ec2.internal\" DevicePath \"\"" Apr 16 20:57:47.470899 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.470826 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pfjcf_must-gather-s4llb_a89d263b-e10a-4f9e-a9a0-900edd812513/copy/0.log" Apr 16 20:57:47.471140 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.471116 2563 generic.go:358] "Generic (PLEG): container finished" podID="a89d263b-e10a-4f9e-a9a0-900edd812513" containerID="7bd2fa3b08c177a15e2ae7d6f6a762ec2763142a78f044d10a46211b57588880" exitCode=143 Apr 16 20:57:47.471189 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.471164 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pfjcf/must-gather-s4llb" Apr 16 20:57:47.471230 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.471209 2563 scope.go:117] "RemoveContainer" containerID="7bd2fa3b08c177a15e2ae7d6f6a762ec2763142a78f044d10a46211b57588880" Apr 16 20:57:47.475711 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.475682 2563 status_manager.go:895] "Failed to get status for pod" podUID="a89d263b-e10a-4f9e-a9a0-900edd812513" pod="openshift-must-gather-pfjcf/must-gather-s4llb" err="pods \"must-gather-s4llb\" is forbidden: User \"system:node:ip-10-0-135-244.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-pfjcf\": no relationship found between node 'ip-10-0-135-244.ec2.internal' and this object" Apr 16 20:57:47.478979 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.478952 2563 scope.go:117] "RemoveContainer" containerID="ea18f53e8b6c0dd9046634bd4a777e7e37bad4c2791889faf64c454b8065717f" Apr 16 20:57:47.481743 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.481715 2563 status_manager.go:895] "Failed to get status for pod" podUID="a89d263b-e10a-4f9e-a9a0-900edd812513" pod="openshift-must-gather-pfjcf/must-gather-s4llb" err="pods \"must-gather-s4llb\" is forbidden: User \"system:node:ip-10-0-135-244.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-pfjcf\": no relationship found between node 'ip-10-0-135-244.ec2.internal' and this object" Apr 16 20:57:47.489107 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.489089 2563 scope.go:117] "RemoveContainer" containerID="7bd2fa3b08c177a15e2ae7d6f6a762ec2763142a78f044d10a46211b57588880" Apr 16 20:57:47.489355 ip-10-0-135-244 kubenswrapper[2563]: E0416 20:57:47.489336 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bd2fa3b08c177a15e2ae7d6f6a762ec2763142a78f044d10a46211b57588880\": container with ID 
starting with 7bd2fa3b08c177a15e2ae7d6f6a762ec2763142a78f044d10a46211b57588880 not found: ID does not exist" containerID="7bd2fa3b08c177a15e2ae7d6f6a762ec2763142a78f044d10a46211b57588880" Apr 16 20:57:47.489427 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.489363 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bd2fa3b08c177a15e2ae7d6f6a762ec2763142a78f044d10a46211b57588880"} err="failed to get container status \"7bd2fa3b08c177a15e2ae7d6f6a762ec2763142a78f044d10a46211b57588880\": rpc error: code = NotFound desc = could not find container \"7bd2fa3b08c177a15e2ae7d6f6a762ec2763142a78f044d10a46211b57588880\": container with ID starting with 7bd2fa3b08c177a15e2ae7d6f6a762ec2763142a78f044d10a46211b57588880 not found: ID does not exist" Apr 16 20:57:47.489427 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.489379 2563 scope.go:117] "RemoveContainer" containerID="ea18f53e8b6c0dd9046634bd4a777e7e37bad4c2791889faf64c454b8065717f" Apr 16 20:57:47.489663 ip-10-0-135-244 kubenswrapper[2563]: E0416 20:57:47.489641 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea18f53e8b6c0dd9046634bd4a777e7e37bad4c2791889faf64c454b8065717f\": container with ID starting with ea18f53e8b6c0dd9046634bd4a777e7e37bad4c2791889faf64c454b8065717f not found: ID does not exist" containerID="ea18f53e8b6c0dd9046634bd4a777e7e37bad4c2791889faf64c454b8065717f" Apr 16 20:57:47.489722 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.489669 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea18f53e8b6c0dd9046634bd4a777e7e37bad4c2791889faf64c454b8065717f"} err="failed to get container status \"ea18f53e8b6c0dd9046634bd4a777e7e37bad4c2791889faf64c454b8065717f\": rpc error: code = NotFound desc = could not find container \"ea18f53e8b6c0dd9046634bd4a777e7e37bad4c2791889faf64c454b8065717f\": container with ID starting with 
ea18f53e8b6c0dd9046634bd4a777e7e37bad4c2791889faf64c454b8065717f not found: ID does not exist" Apr 16 20:57:47.798257 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:47.798181 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a89d263b-e10a-4f9e-a9a0-900edd812513" path="/var/lib/kubelet/pods/a89d263b-e10a-4f9e-a9a0-900edd812513/volumes" Apr 16 20:57:48.656619 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:48.656560 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5wkvv_519fe37e-7387-44fb-bec2-9430b0c20e29/node-exporter/0.log" Apr 16 20:57:48.674510 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:48.674490 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5wkvv_519fe37e-7387-44fb-bec2-9430b0c20e29/kube-rbac-proxy/0.log" Apr 16 20:57:48.696241 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:48.696220 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5wkvv_519fe37e-7387-44fb-bec2-9430b0c20e29/init-textfile/0.log" Apr 16 20:57:48.956194 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:48.956164 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2226d070-e026-4dbf-8dad-527a1aa3eb7d/prometheus/0.log" Apr 16 20:57:48.973480 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:48.973458 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2226d070-e026-4dbf-8dad-527a1aa3eb7d/config-reloader/0.log" Apr 16 20:57:48.994578 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:48.994557 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2226d070-e026-4dbf-8dad-527a1aa3eb7d/thanos-sidecar/0.log" Apr 16 20:57:49.014483 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:49.014466 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2226d070-e026-4dbf-8dad-527a1aa3eb7d/kube-rbac-proxy-web/0.log" Apr 16 20:57:49.037489 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:49.037469 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2226d070-e026-4dbf-8dad-527a1aa3eb7d/kube-rbac-proxy/0.log" Apr 16 20:57:49.059556 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:49.059534 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2226d070-e026-4dbf-8dad-527a1aa3eb7d/kube-rbac-proxy-thanos/0.log" Apr 16 20:57:49.078934 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:49.078919 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2226d070-e026-4dbf-8dad-527a1aa3eb7d/init-config-reloader/0.log" Apr 16 20:57:49.151037 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:49.151007 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-d2rzb_abe80e33-639d-48ee-a5c7-d2a276c94434/prometheus-operator-admission-webhook/0.log" Apr 16 20:57:51.413781 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:51.413739 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-vjnr7_9f502296-7bae-46a2-93aa-fb3effb2035b/download-server/0.log" Apr 16 20:57:52.160110 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.160075 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f"] Apr 16 20:57:52.160361 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.160349 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a89d263b-e10a-4f9e-a9a0-900edd812513" containerName="gather" Apr 16 20:57:52.160405 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.160362 2563 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a89d263b-e10a-4f9e-a9a0-900edd812513" containerName="gather" Apr 16 20:57:52.160405 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.160370 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a89d263b-e10a-4f9e-a9a0-900edd812513" containerName="copy" Apr 16 20:57:52.160405 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.160375 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a89d263b-e10a-4f9e-a9a0-900edd812513" containerName="copy" Apr 16 20:57:52.160502 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.160440 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="a89d263b-e10a-4f9e-a9a0-900edd812513" containerName="copy" Apr 16 20:57:52.160502 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.160451 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="a89d263b-e10a-4f9e-a9a0-900edd812513" containerName="gather" Apr 16 20:57:52.165517 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.165493 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" Apr 16 20:57:52.168217 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.168182 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-x4l79\"/\"default-dockercfg-4rfsr\"" Apr 16 20:57:52.169370 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.169351 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x4l79\"/\"kube-root-ca.crt\"" Apr 16 20:57:52.169487 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.169408 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x4l79\"/\"openshift-service-ca.crt\"" Apr 16 20:57:52.170034 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.170004 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f"] Apr 16 20:57:52.198821 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.198788 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7543fcac-9454-4921-ad34-cf24c6774c25-sys\") pod \"perf-node-gather-daemonset-htl2f\" (UID: \"7543fcac-9454-4921-ad34-cf24c6774c25\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" Apr 16 20:57:52.198821 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.198823 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7543fcac-9454-4921-ad34-cf24c6774c25-proc\") pod \"perf-node-gather-daemonset-htl2f\" (UID: \"7543fcac-9454-4921-ad34-cf24c6774c25\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" Apr 16 20:57:52.199037 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.198846 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7543fcac-9454-4921-ad34-cf24c6774c25-podres\") pod \"perf-node-gather-daemonset-htl2f\" (UID: \"7543fcac-9454-4921-ad34-cf24c6774c25\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" Apr 16 20:57:52.199037 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.198954 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7543fcac-9454-4921-ad34-cf24c6774c25-lib-modules\") pod \"perf-node-gather-daemonset-htl2f\" (UID: \"7543fcac-9454-4921-ad34-cf24c6774c25\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" Apr 16 20:57:52.199037 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.199003 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blvm4\" (UniqueName: \"kubernetes.io/projected/7543fcac-9454-4921-ad34-cf24c6774c25-kube-api-access-blvm4\") pod \"perf-node-gather-daemonset-htl2f\" (UID: \"7543fcac-9454-4921-ad34-cf24c6774c25\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" Apr 16 20:57:52.300229 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.300194 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7543fcac-9454-4921-ad34-cf24c6774c25-lib-modules\") pod \"perf-node-gather-daemonset-htl2f\" (UID: \"7543fcac-9454-4921-ad34-cf24c6774c25\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" Apr 16 20:57:52.300229 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.300234 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blvm4\" (UniqueName: \"kubernetes.io/projected/7543fcac-9454-4921-ad34-cf24c6774c25-kube-api-access-blvm4\") pod \"perf-node-gather-daemonset-htl2f\" (UID: \"7543fcac-9454-4921-ad34-cf24c6774c25\") " 
pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" Apr 16 20:57:52.300448 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.300270 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7543fcac-9454-4921-ad34-cf24c6774c25-sys\") pod \"perf-node-gather-daemonset-htl2f\" (UID: \"7543fcac-9454-4921-ad34-cf24c6774c25\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" Apr 16 20:57:52.300448 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.300288 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7543fcac-9454-4921-ad34-cf24c6774c25-proc\") pod \"perf-node-gather-daemonset-htl2f\" (UID: \"7543fcac-9454-4921-ad34-cf24c6774c25\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" Apr 16 20:57:52.300448 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.300306 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7543fcac-9454-4921-ad34-cf24c6774c25-podres\") pod \"perf-node-gather-daemonset-htl2f\" (UID: \"7543fcac-9454-4921-ad34-cf24c6774c25\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" Apr 16 20:57:52.300448 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.300409 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7543fcac-9454-4921-ad34-cf24c6774c25-podres\") pod \"perf-node-gather-daemonset-htl2f\" (UID: \"7543fcac-9454-4921-ad34-cf24c6774c25\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" Apr 16 20:57:52.300448 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.300404 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7543fcac-9454-4921-ad34-cf24c6774c25-lib-modules\") pod 
\"perf-node-gather-daemonset-htl2f\" (UID: \"7543fcac-9454-4921-ad34-cf24c6774c25\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" Apr 16 20:57:52.300448 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.300409 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7543fcac-9454-4921-ad34-cf24c6774c25-sys\") pod \"perf-node-gather-daemonset-htl2f\" (UID: \"7543fcac-9454-4921-ad34-cf24c6774c25\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" Apr 16 20:57:52.300722 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.300409 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7543fcac-9454-4921-ad34-cf24c6774c25-proc\") pod \"perf-node-gather-daemonset-htl2f\" (UID: \"7543fcac-9454-4921-ad34-cf24c6774c25\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" Apr 16 20:57:52.308301 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.308279 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blvm4\" (UniqueName: \"kubernetes.io/projected/7543fcac-9454-4921-ad34-cf24c6774c25-kube-api-access-blvm4\") pod \"perf-node-gather-daemonset-htl2f\" (UID: \"7543fcac-9454-4921-ad34-cf24c6774c25\") " pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" Apr 16 20:57:52.453676 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.453575 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dtqvs_133663ab-a7a5-4f8a-8659-5dcb18604eed/dns/0.log" Apr 16 20:57:52.472780 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.472755 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dtqvs_133663ab-a7a5-4f8a-8659-5dcb18604eed/kube-rbac-proxy/0.log" Apr 16 20:57:52.476567 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.476552 2563 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" Apr 16 20:57:52.597275 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.597240 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f"] Apr 16 20:57:52.601563 ip-10-0-135-244 kubenswrapper[2563]: W0416 20:57:52.601537 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7543fcac_9454_4921_ad34_cf24c6774c25.slice/crio-d5fa258cc9a738cb8bc95589616a75e4a2f7d69b2bbc90c2c25f1ffd31a4f452 WatchSource:0}: Error finding container d5fa258cc9a738cb8bc95589616a75e4a2f7d69b2bbc90c2c25f1ffd31a4f452: Status 404 returned error can't find the container with id d5fa258cc9a738cb8bc95589616a75e4a2f7d69b2bbc90c2c25f1ffd31a4f452 Apr 16 20:57:52.623774 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:52.623751 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6hmqh_fdcd8eab-4705-4045-bbc6-5974072ac6dd/dns-node-resolver/0.log" Apr 16 20:57:53.091443 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:53.091364 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5b8bff98b5-kb9d6_a79633d3-d516-48b7-b2f8-86c85d66903a/registry/0.log" Apr 16 20:57:53.176064 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:53.176028 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nm7vd_6eefa0ff-7de4-4c45-af84-a83e70151ad6/node-ca/0.log" Apr 16 20:57:53.491191 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:53.491154 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" event={"ID":"7543fcac-9454-4921-ad34-cf24c6774c25","Type":"ContainerStarted","Data":"ca755eb41b90a168a9ce53000c5156e0e5acb69d625093671cddb3347f34fc39"} Apr 16 20:57:53.491191 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:53.491193 
2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" event={"ID":"7543fcac-9454-4921-ad34-cf24c6774c25","Type":"ContainerStarted","Data":"d5fa258cc9a738cb8bc95589616a75e4a2f7d69b2bbc90c2c25f1ffd31a4f452"} Apr 16 20:57:53.491650 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:53.491294 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" Apr 16 20:57:53.509966 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:53.509925 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f" podStartSLOduration=1.5099143659999998 podStartE2EDuration="1.509914366s" podCreationTimestamp="2026-04-16 20:57:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:57:53.508970391 +0000 UTC m=+3852.285547389" watchObservedRunningTime="2026-04-16 20:57:53.509914366 +0000 UTC m=+3852.286491365" Apr 16 20:57:54.253966 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:54.253916 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8q4g7_02f069bd-5606-4bac-9784-8646fdf8c979/serve-healthcheck-canary/0.log" Apr 16 20:57:54.813724 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:54.813698 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9tsr8_0200234c-4441-4ee0-a6b1-e543a08da9b8/kube-rbac-proxy/0.log" Apr 16 20:57:54.846272 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:54.846239 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9tsr8_0200234c-4441-4ee0-a6b1-e543a08da9b8/exporter/0.log" Apr 16 20:57:54.869373 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:54.869353 2563 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9tsr8_0200234c-4441-4ee0-a6b1-e543a08da9b8/extractor/0.log"
Apr 16 20:57:56.970857 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:56.970820 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-659c8cbdc-zcv4s_81118176-8453-47ec-aebb-16dbdb36e543/manager/0.log"
Apr 16 20:57:57.284309 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:57.284232 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-zk4jn_186fa644-f8cd-4d1f-af46-25ac0647fcc4/s3-init/0.log"
Apr 16 20:57:57.309764 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:57.309730 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-gphjt_6ed4e661-f1a5-4338-afbe-ee7ef4fd7206/s3-tls-init-custom/0.log"
Apr 16 20:57:57.331531 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:57.331509 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-54ggl_026cf755-d4b4-4854-a1d6-b2eb9ad872c3/s3-tls-init-serving/0.log"
Apr 16 20:57:57.362902 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:57.362879 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-twnvx_2dd12e1f-5557-40d7-8f6d-72087eb846db/seaweedfs/0.log"
Apr 16 20:57:57.408950 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:57.408918 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-kgj7c_4c355f66-b85a-42de-b606-32849f924a8d/seaweedfs-tls-serving/0.log"
Apr 16 20:57:59.503384 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:57:59.503348 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-x4l79/perf-node-gather-daemonset-htl2f"
Apr 16 20:58:01.108742 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:01.108668 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-d6h88_1ad38f3c-cccf-4846-8c4c-864918ab774f/migrator/0.log"
Apr 16 20:58:01.128566 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:01.128543 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-d6h88_1ad38f3c-cccf-4846-8c4c-864918ab774f/graceful-termination/0.log"
Apr 16 20:58:02.779426 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:02.779397 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vqw9q_fedbf08e-3ecd-47fe-bbea-4ca1def89a98/kube-multus-additional-cni-plugins/0.log"
Apr 16 20:58:02.801067 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:02.801044 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vqw9q_fedbf08e-3ecd-47fe-bbea-4ca1def89a98/egress-router-binary-copy/0.log"
Apr 16 20:58:02.823456 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:02.823398 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vqw9q_fedbf08e-3ecd-47fe-bbea-4ca1def89a98/cni-plugins/0.log"
Apr 16 20:58:02.842753 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:02.842735 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vqw9q_fedbf08e-3ecd-47fe-bbea-4ca1def89a98/bond-cni-plugin/0.log"
Apr 16 20:58:02.863209 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:02.863192 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vqw9q_fedbf08e-3ecd-47fe-bbea-4ca1def89a98/routeoverride-cni/0.log"
Apr 16 20:58:02.883688 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:02.883667 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vqw9q_fedbf08e-3ecd-47fe-bbea-4ca1def89a98/whereabouts-cni-bincopy/0.log"
Apr 16 20:58:02.905844 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:02.905825 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vqw9q_fedbf08e-3ecd-47fe-bbea-4ca1def89a98/whereabouts-cni/0.log"
Apr 16 20:58:02.982643 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:02.982622 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rjds9_b0c44e61-db3b-4f44-bfc4-d928140603e4/kube-multus/0.log"
Apr 16 20:58:03.138982 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:03.138953 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-p54df_81d750f7-8363-48b6-afd3-9847607883b7/network-metrics-daemon/0.log"
Apr 16 20:58:03.156844 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:03.156822 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-p54df_81d750f7-8363-48b6-afd3-9847607883b7/kube-rbac-proxy/0.log"
Apr 16 20:58:04.314376 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:04.314348 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-controller/0.log"
Apr 16 20:58:04.330369 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:04.330345 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/0.log"
Apr 16 20:58:04.362893 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:04.362871 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovn-acl-logging/1.log"
Apr 16 20:58:04.381983 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:04.381960 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/kube-rbac-proxy-node/0.log"
Apr 16 20:58:04.403913 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:04.403891 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 20:58:04.423100 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:04.423078 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/northd/0.log"
Apr 16 20:58:04.443018 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:04.442993 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/nbdb/0.log"
Apr 16 20:58:04.463894 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:04.463871 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/sbdb/0.log"
Apr 16 20:58:04.622276 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:04.622250 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8s7w4_5037aa30-5243-46c1-9238-71a0ee0cc436/ovnkube-controller/0.log"
Apr 16 20:58:05.980545 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:05.980515 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-chnql_c7e55932-e28c-4952-86fc-0a2e235083be/network-check-target-container/0.log"
Apr 16 20:58:07.024796 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:07.024745 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-sg2qr_4a94789b-3b6e-4baf-b5d0-edbc3d2d18cb/iptables-alerter/0.log"
Apr 16 20:58:07.741713 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:07.741682 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-tbhz7_41cdc007-f7dd-4cb0-9634-b6c0f69c8ff4/tuned/0.log"
Apr 16 20:58:10.744786 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:10.744753 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-stfgk_99245d8b-f916-4b7b-907b-c110d66c41c9/service-ca-controller/0.log"
Apr 16 20:58:11.120042 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:11.120015 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-64nb2_940f6882-9538-4742-9cdc-585d4ceabae6/csi-driver/0.log"
Apr 16 20:58:11.141058 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:11.141040 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-64nb2_940f6882-9538-4742-9cdc-585d4ceabae6/csi-node-driver-registrar/0.log"
Apr 16 20:58:11.162281 ip-10-0-135-244 kubenswrapper[2563]: I0416 20:58:11.162261 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-64nb2_940f6882-9538-4742-9cdc-585d4ceabae6/csi-liveness-probe/0.log"