Apr 20 14:51:01.218972 ip-10-0-129-82 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 14:51:01.218986 ip-10-0-129-82 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 14:51:01.218996 ip-10-0-129-82 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 14:51:01.219328 ip-10-0-129-82 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 14:51:11.304181 ip-10-0-129-82 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 14:51:11.304196 ip-10-0-129-82 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 0dd90609031344dfb1f7bc52a10194e8 --
Apr 20 14:53:28.134804 ip-10-0-129-82 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 14:53:28.640221 ip-10-0-129-82 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:53:28.640221 ip-10-0-129-82 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 14:53:28.640221 ip-10-0-129-82 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:53:28.640221 ip-10-0-129-82 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 14:53:28.640221 ip-10-0-129-82 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:53:28.642925 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.642839 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 14:53:28.646072 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646057 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:53:28.646072 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646072 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:53:28.646129 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646077 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:53:28.646129 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646080 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:53:28.646129 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646083 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:53:28.646129 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646086 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:53:28.646129 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646090 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:53:28.646129 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646092 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:53:28.646129 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646095 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:53:28.646129 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646098 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:53:28.646129 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646101 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:53:28.646129 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646103 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:53:28.646129 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646106 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:53:28.646129 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646109 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:53:28.646129 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646111 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:53:28.646129 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646114 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:53:28.646129 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646117 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:53:28.646129 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646125 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:53:28.646129 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646128 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:53:28.646129 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646131 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:53:28.646129 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646133 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:53:28.646129 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646136 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:53:28.646605 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646139 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:53:28.646605 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646142 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:53:28.646605 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646145 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:53:28.646605 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646147 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:53:28.646605 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646150 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:53:28.646605 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646153 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:53:28.646605 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646155 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:53:28.646605 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646157 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:53:28.646605 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646160 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:53:28.646605 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646162 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:53:28.646605 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646165 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:53:28.646605 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646170 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:53:28.646605 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646174 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:53:28.646605 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646177 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:53:28.646605 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646180 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:53:28.646605 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646183 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:53:28.646605 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646185 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:53:28.646605 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646188 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:53:28.646605 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646190 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:53:28.647271 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646193 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:53:28.647271 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646195 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:53:28.647271 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646198 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:53:28.647271 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646201 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:53:28.647271 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646203 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:53:28.647271 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646206 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:53:28.647271 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646209 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:53:28.647271 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646211 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:53:28.647271 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646214 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:53:28.647271 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646217 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:53:28.647271 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646219 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:53:28.647271 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646222 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:53:28.647271 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646225 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:53:28.647271 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646227 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:53:28.647271 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646230 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:53:28.647271 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646233 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:53:28.647271 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646235 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:53:28.647271 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646238 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:53:28.647271 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646241 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:53:28.647894 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646244 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:53:28.647894 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646247 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:53:28.647894 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646250 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:53:28.647894 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646253 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:53:28.647894 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646255 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:53:28.647894 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646258 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:53:28.647894 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646260 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:53:28.647894 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646263 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:53:28.647894 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646265 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:53:28.647894 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646268 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:53:28.647894 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646271 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:53:28.647894 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646273 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:53:28.647894 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646276 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:53:28.647894 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646278 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:53:28.647894 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646281 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:53:28.647894 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646283 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:53:28.647894 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646286 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:53:28.647894 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646288 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:53:28.647894 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646291 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:53:28.647894 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646294 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:53:28.648366 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646297 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:53:28.648366 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646299 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:53:28.648366 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646302 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:53:28.648366 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646305 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:53:28.648366 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646307 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:53:28.648366 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.646310 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:53:28.648366 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648000 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:53:28.648366 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648014 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:53:28.648366 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648017 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:53:28.648366 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648021 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:53:28.648366 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648023 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:53:28.648366 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648027 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:53:28.648366 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648031 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:53:28.648366 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648034 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:53:28.648366 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648037 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:53:28.648366 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648040 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:53:28.648366 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648044 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:53:28.648366 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648047 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:53:28.648366 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648049 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:53:28.648366 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648052 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:53:28.648872 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648055 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:53:28.648872 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648058 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:53:28.648872 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648060 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:53:28.648872 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648063 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:53:28.648872 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648065 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:53:28.648872 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648069 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:53:28.648872 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648073 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:53:28.648872 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648077 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:53:28.648872 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648080 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:53:28.648872 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648083 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:53:28.648872 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648086 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:53:28.648872 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648089 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:53:28.648872 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648092 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:53:28.648872 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648096 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:53:28.648872 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648099 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:53:28.648872 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648102 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:53:28.648872 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648105 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:53:28.648872 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648107 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:53:28.648872 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648110 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:53:28.649418 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648114 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:53:28.649418 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648117 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:53:28.649418 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648120 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:53:28.649418 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648122 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:53:28.649418 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648125 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:53:28.649418 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648128 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:53:28.649418 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648132 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:53:28.649418 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648134 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:53:28.649418 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648137 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:53:28.649418 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648140 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:53:28.649418 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648142 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:53:28.649418 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648145 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:53:28.649418 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648148 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:53:28.649418 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648150 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:53:28.649418 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648153 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:53:28.649418 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648155 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:53:28.649418 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648159 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:53:28.649418 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648162 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:53:28.649418 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648164 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:53:28.649418 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648167 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:53:28.649934 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648169 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:53:28.649934 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648172 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:53:28.649934 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648174 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:53:28.649934 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648177 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:53:28.649934 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648179 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:53:28.649934 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648182 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:53:28.649934 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648184 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:53:28.649934 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648187 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:53:28.649934 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648189 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:53:28.649934 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648191 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:53:28.649934 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648194 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:53:28.649934 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648198 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:53:28.649934 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648201 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:53:28.649934 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648204 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:53:28.649934 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648206 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:53:28.649934 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648209 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:53:28.649934 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648213 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:53:28.649934 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648215 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:53:28.649934 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648218 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:53:28.649934 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648220 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648223 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648225 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648228 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648230 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648233 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648242 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648245 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648248 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648250 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648252 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648255 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648257 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.648260 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649014 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649024 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649031 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649035 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649040 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649044 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649048 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 14:53:28.650406 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649053 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649056 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649059 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649063 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649066 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649070 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649073 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649075 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649078 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649081 2575 flags.go:64] FLAG: --cloud-config=""
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649084 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649087 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649092 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649095 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649098 2575 flags.go:64] FLAG: --config-dir=""
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649101 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649104 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649108 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649112 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649115 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649118 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649121 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649124 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649127 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649130 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 14:53:28.650919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649134 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649138 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649141 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649143 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649146 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649149 2575 flags.go:64] FLAG: --enable-server="true"
Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649152 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649156 2575 flags.go:64] FLAG: --event-burst="100"
Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649159 2575 flags.go:64] FLAG: --event-qps="50"
Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649162 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649165 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649168 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649173 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649176 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649178 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649181 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649184 2575 flags.go:64] FLAG: 
--eviction-soft-grace-period="" Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649187 2575 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649190 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649193 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649195 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649198 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649201 2575 flags.go:64] FLAG: --feature-gates="" Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649205 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649208 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 14:53:28.651540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649211 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649216 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649219 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649223 2575 flags.go:64] FLAG: --help="false" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649225 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-129-82.ec2.internal" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649229 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: 
I0420 14:53:28.649231 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649234 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649238 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649241 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649244 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649247 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649249 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649252 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649255 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649258 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649261 2575 flags.go:64] FLAG: --kube-reserved="" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649264 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649267 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649270 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 14:53:28.652136 
ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649273 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649276 2575 flags.go:64] FLAG: --lock-file="" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649279 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649282 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 14:53:28.652136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649285 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649290 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649293 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649295 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649298 2575 flags.go:64] FLAG: --logging-format="text" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649301 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649304 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649307 2575 flags.go:64] FLAG: --manifest-url="" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649310 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649314 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649318 2575 flags.go:64] FLAG: 
--max-open-files="1000000" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649322 2575 flags.go:64] FLAG: --max-pods="110" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649325 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649328 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649331 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649333 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649336 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649340 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649343 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649350 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649353 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649356 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649359 2575 flags.go:64] FLAG: --pod-cidr="" Apr 20 14:53:28.652739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649362 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: 
I0420 14:53:28.649367 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649369 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649372 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649375 2575 flags.go:64] FLAG: --port="10250" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649378 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649381 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0db09ba3ec235eb6f" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649384 2575 flags.go:64] FLAG: --qos-reserved="" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649387 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649390 2575 flags.go:64] FLAG: --register-node="true" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649393 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649395 2575 flags.go:64] FLAG: --register-with-taints="" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649399 2575 flags.go:64] FLAG: --registry-burst="10" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649402 2575 flags.go:64] FLAG: --registry-qps="5" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649405 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649408 2575 flags.go:64] FLAG: --reserved-memory="" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649412 2575 flags.go:64] FLAG: 
--resolv-conf="/etc/resolv.conf" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649415 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649419 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649422 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649426 2575 flags.go:64] FLAG: --runonce="false" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649428 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649432 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649435 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649438 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649441 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 14:53:28.653284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649444 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649447 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649450 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649453 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649456 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 14:53:28.653921 ip-10-0-129-82 
kubenswrapper[2575]: I0420 14:53:28.649459 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649462 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649464 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649467 2575 flags.go:64] FLAG: --system-cgroups="" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649470 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649476 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649479 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649482 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649486 2575 flags.go:64] FLAG: --tls-min-version="" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649489 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649492 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649495 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649498 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649500 2575 flags.go:64] FLAG: --v="2" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649505 2575 flags.go:64] FLAG: --version="false" Apr 20 14:53:28.653921 
ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649521 2575 flags.go:64] FLAG: --vmodule="" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649526 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.649529 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649645 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 14:53:28.653921 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649649 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 14:53:28.654542 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649652 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 14:53:28.654542 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649655 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 14:53:28.654542 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649659 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 14:53:28.654542 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649662 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 14:53:28.654542 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649666 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 14:53:28.654542 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649669 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 14:53:28.654542 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649672 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 14:53:28.654542 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649675 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 14:53:28.654542 
ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649678 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 14:53:28.654542 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649680 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 14:53:28.654542 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649683 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 14:53:28.654542 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649686 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 14:53:28.654542 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649688 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 14:53:28.654542 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649690 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 14:53:28.654542 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649693 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 14:53:28.654542 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649696 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 14:53:28.654542 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649698 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 14:53:28.654542 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649701 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 14:53:28.654542 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649703 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 14:53:28.654542 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649706 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 14:53:28.655040 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649708 2575 feature_gate.go:328] unrecognized feature gate: 
AWSServiceLBNetworkSecurityGroup Apr 20 14:53:28.655040 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649711 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 14:53:28.655040 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649713 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 14:53:28.655040 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649716 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 14:53:28.655040 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649719 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 14:53:28.655040 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649721 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 14:53:28.655040 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649724 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 14:53:28.655040 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649727 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 14:53:28.655040 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649730 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 14:53:28.655040 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649732 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 14:53:28.655040 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649735 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 14:53:28.655040 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649737 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 14:53:28.655040 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649740 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 14:53:28.655040 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649742 2575 feature_gate.go:328] 
unrecognized feature gate: BootcNodeManagement Apr 20 14:53:28.655040 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649745 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 14:53:28.655040 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649748 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 14:53:28.655040 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649750 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 14:53:28.655040 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649754 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 14:53:28.655040 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649758 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 14:53:28.655040 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649761 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 14:53:28.655541 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649764 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 14:53:28.655541 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649766 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 14:53:28.655541 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649769 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 14:53:28.655541 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649771 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 14:53:28.655541 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649774 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 14:53:28.655541 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649776 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 
14:53:28.655541 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649778 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 14:53:28.655541 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649781 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 14:53:28.655541 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649783 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 14:53:28.655541 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649786 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 14:53:28.655541 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649788 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 14:53:28.655541 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649790 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 14:53:28.655541 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649793 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 14:53:28.655541 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649795 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 14:53:28.655541 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649798 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 14:53:28.655541 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649800 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 14:53:28.655541 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649803 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 14:53:28.655541 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649805 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 14:53:28.655541 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649807 2575 
feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 14:53:28.655996 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649810 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 14:53:28.655996 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649812 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 14:53:28.655996 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649815 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 14:53:28.655996 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649820 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 14:53:28.655996 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649824 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 14:53:28.655996 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649827 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 14:53:28.655996 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649830 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 14:53:28.655996 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649833 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 14:53:28.655996 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649837 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:53:28.655996 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649841 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:53:28.655996 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649844 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:53:28.655996 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649846 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:53:28.655996 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649849 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:53:28.655996 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649852 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:53:28.655996 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649855 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:53:28.655996 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649857 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:53:28.655996 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649859 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:53:28.655996 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649862 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:53:28.655996 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649864 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:53:28.656783 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649867 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:53:28.656783 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649869 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:53:28.656783 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649871 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:53:28.656783 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649874 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:53:28.656783 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649877 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:53:28.656783 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.649879 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:53:28.656783 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.650624 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 14:53:28.658475 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.658453 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 14:53:28.658475 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.658475 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 14:53:28.658727 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658560 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:53:28.658727 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658569 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:53:28.658727 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658573 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:53:28.658727 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658578 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:53:28.658727 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658583 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:53:28.658727 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658587 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:53:28.658727 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658591 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:53:28.658727 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658595 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:53:28.658727 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658598 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:53:28.658727 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658602 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:53:28.658727 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658606 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:53:28.658727 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658610 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:53:28.658727 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658613 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:53:28.658727 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658617 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:53:28.658727 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658621 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:53:28.658727 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658625 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:53:28.658727 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658629 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:53:28.658727 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658633 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:53:28.658727 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658637 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:53:28.658727 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658640 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:53:28.659601 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658644 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:53:28.659601 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658648 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:53:28.659601 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658652 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:53:28.659601 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658658 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:53:28.659601 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658663 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:53:28.659601 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658667 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:53:28.659601 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658671 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:53:28.659601 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658675 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:53:28.659601 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658679 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:53:28.659601 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658684 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:53:28.659601 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658687 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:53:28.659601 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658692 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:53:28.659601 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658696 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:53:28.659601 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658701 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:53:28.659601 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658706 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:53:28.659601 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658710 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:53:28.659601 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658715 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:53:28.659601 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658719 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:53:28.659601 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658723 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:53:28.660180 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658727 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:53:28.660180 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658731 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:53:28.660180 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658735 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:53:28.660180 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658739 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:53:28.660180 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658744 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:53:28.660180 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658747 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:53:28.660180 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658751 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:53:28.660180 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658755 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:53:28.660180 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658759 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:53:28.660180 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658764 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:53:28.660180 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658768 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:53:28.660180 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658772 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:53:28.660180 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658777 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:53:28.660180 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658780 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:53:28.660180 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658785 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:53:28.660180 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658789 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:53:28.660180 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658792 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:53:28.660180 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658796 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:53:28.660180 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658800 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:53:28.660180 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658804 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:53:28.660960 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658808 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:53:28.660960 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658812 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:53:28.660960 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658817 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:53:28.660960 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658821 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:53:28.660960 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658825 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:53:28.660960 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658841 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:53:28.660960 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658847 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:53:28.660960 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658851 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:53:28.660960 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658855 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:53:28.660960 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658859 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:53:28.660960 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658863 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:53:28.660960 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658867 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:53:28.660960 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658871 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:53:28.660960 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658876 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:53:28.660960 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658880 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:53:28.660960 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658884 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:53:28.660960 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658888 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:53:28.660960 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658892 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:53:28.660960 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658896 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:53:28.660960 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658900 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:53:28.661792 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658904 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:53:28.661792 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658908 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:53:28.661792 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658912 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:53:28.661792 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658919 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:53:28.661792 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658925 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:53:28.661792 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658930 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:53:28.661792 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.658934 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:53:28.661792 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.658942 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 14:53:28.661792 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659100 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:53:28.661792 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659108 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:53:28.661792 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659112 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:53:28.661792 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659117 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:53:28.661792 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659122 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:53:28.661792 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659126 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:53:28.661792 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659131 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:53:28.662173 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659137 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:53:28.662173 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659141 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:53:28.662173 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659145 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:53:28.662173 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659150 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:53:28.662173 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659155 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:53:28.662173 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659159 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:53:28.662173 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659163 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:53:28.662173 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659167 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:53:28.662173 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659171 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:53:28.662173 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659175 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:53:28.662173 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659179 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:53:28.662173 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659183 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:53:28.662173 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659187 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:53:28.662173 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659192 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:53:28.662173 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659196 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:53:28.662173 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659200 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:53:28.662173 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659206 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:53:28.662173 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659213 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:53:28.662173 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659217 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:53:28.662773 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659222 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:53:28.662773 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659226 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:53:28.662773 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659230 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:53:28.662773 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659234 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:53:28.662773 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659238 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:53:28.662773 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659243 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:53:28.662773 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659247 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:53:28.662773 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659251 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:53:28.662773 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659256 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:53:28.662773 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659260 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:53:28.662773 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659265 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:53:28.662773 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659269 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:53:28.662773 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659273 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:53:28.662773 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659277 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:53:28.662773 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659281 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:53:28.662773 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659286 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:53:28.662773 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659290 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:53:28.662773 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659296 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:53:28.662773 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659300 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:53:28.662773 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659304 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:53:28.663243 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659308 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:53:28.663243 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659312 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:53:28.663243 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659316 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:53:28.663243 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659320 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:53:28.663243 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659324 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:53:28.663243 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659328 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:53:28.663243 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659332 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:53:28.663243 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659336 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:53:28.663243 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659341 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:53:28.663243 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659348 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 14:53:28.663243 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659353 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 14:53:28.663243 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659357 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 14:53:28.663243 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659361 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 14:53:28.663243 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659365 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 14:53:28.663243 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659369 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 14:53:28.663243 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659373 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 14:53:28.663243 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659378 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 14:53:28.663243 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659381 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 14:53:28.663243 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659385 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 14:53:28.663803 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659389 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 14:53:28.663803 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659393 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 14:53:28.663803 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659397 
2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 14:53:28.663803 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659401 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 14:53:28.663803 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659405 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 14:53:28.663803 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659409 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 14:53:28.663803 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659413 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 14:53:28.663803 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659417 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 14:53:28.663803 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659422 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 14:53:28.663803 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659426 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 14:53:28.663803 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659430 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 14:53:28.663803 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659435 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 14:53:28.663803 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659439 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 14:53:28.663803 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659443 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 14:53:28.663803 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659447 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 14:53:28.663803 ip-10-0-129-82 
kubenswrapper[2575]: W0420 14:53:28.659451 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 14:53:28.663803 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659454 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 14:53:28.663803 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659459 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 14:53:28.663803 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659463 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 14:53:28.663803 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659466 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 14:53:28.664317 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:28.659470 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 14:53:28.664317 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.659478 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 14:53:28.664317 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.659650 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 14:53:28.664317 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.662379 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 14:53:28.664317 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.663976 2575 server.go:1019] "Starting client certificate rotation" Apr 20 14:53:28.664317 
ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.664077 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 14:53:28.664317 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.664115 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 14:53:28.696524 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.696496 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 14:53:28.699278 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.699254 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 14:53:28.712088 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.712065 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 20 14:53:28.718667 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.718646 2575 log.go:25] "Validated CRI v1 image API"
Apr 20 14:53:28.719919 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.719894 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 14:53:28.725256 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.725235 2575 fs.go:135] Filesystem UUIDs: map[26d115e0-3f7f-4af6-93e4-8ab8296e1db7:/dev/nvme0n1p4 44b865b8-b077-40f2-81b6-72b018d42f4f:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 20 14:53:28.725305 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.725256 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 14:53:28.729252 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.729229 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 14:53:28.731833 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.731726 2575 manager.go:217] Machine: {Timestamp:2026-04-20 14:53:28.730045933 +0000 UTC m=+0.461364211 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3205127 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2317788be5f9aaccefb5ff84f90de3 SystemUUID:ec231778-8be5-f9aa-ccef-b5ff84f90de3 BootID:0dd90609-0313-44df-b1f7-bc52a10194e8 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:89:e1:97:bd:2b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:89:e1:97:bd:2b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:02:71:7b:9d:ab:7f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 14:53:28.731833 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.731827 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 14:53:28.731929 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.731917 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 14:53:28.734324 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.734297 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 14:53:28.734488 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.734325 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-82.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 14:53:28.734547 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.734497 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 14:53:28.734547 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.734506 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 14:53:28.734547 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.734530 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 14:53:28.734633 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.734550 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 14:53:28.735765 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.735755 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 14:53:28.735867 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.735859 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 14:53:28.739268 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.739257 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 14:53:28.739308 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.739273 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 14:53:28.739308 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.739285 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 14:53:28.739308 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.739295 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 20 14:53:28.739417 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.739315 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 14:53:28.740754 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.740739 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 14:53:28.740797 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.740767 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 14:53:28.744350 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.744334 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 14:53:28.745634 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.745621 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 14:53:28.747609 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.747597 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 14:53:28.747657 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.747615 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 14:53:28.747657 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.747621 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 14:53:28.747657 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.747627 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 14:53:28.747657 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.747635 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 14:53:28.747657 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.747642 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 14:53:28.747657 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.747656 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 14:53:28.747800 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.747661 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 14:53:28.747800 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.747669 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 14:53:28.747800 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.747675 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 14:53:28.747800 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.747688 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 14:53:28.747800 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.747697 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 14:53:28.748990 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.748979 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 14:53:28.749022 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.748991 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 14:53:28.752897 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.752882 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 14:53:28.752958 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.752922 2575 server.go:1295] "Started kubelet"
Apr 20 14:53:28.753056 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.753016 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 14:53:28.753472 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.753411 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 14:53:28.753614 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.753491 2575 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 14:53:28.753683 ip-10-0-129-82 systemd[1]: Started Kubernetes Kubelet.
Apr 20 14:53:28.753955 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.753936 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-82.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 14:53:28.757363 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.757339 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 14:53:28.758353 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:28.758327 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-82.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 14:53:28.758442 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:28.758343 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 14:53:28.759325 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.759307 2575 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 14:53:28.761735 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.761712 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 14:53:28.762318 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.762298 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 14:53:28.763059 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.762921 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 14:53:28.763059 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.763014 2575 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 14:53:28.763059 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.763029 2575 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 14:53:28.763308 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.763099 2575 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 14:53:28.763308 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.763108 2575 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 14:53:28.763399 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.763372 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 14:53:28.763399 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.763384 2575 factory.go:55] Registering systemd factory
Apr 20 14:53:28.763399 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.763391 2575 factory.go:223] Registration of the systemd container factory successfully
Apr 20 14:53:28.763611 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:28.763590 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-82.ec2.internal\" not found"
Apr 20 14:53:28.764401 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.764383 2575 factory.go:153] Registering CRI-O factory
Apr 20 14:53:28.764401 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.764405 2575 factory.go:223] Registration of the crio container factory successfully
Apr 20 14:53:28.764566 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.764423 2575 factory.go:103] Registering Raw factory
Apr 20 14:53:28.764566 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.764437 2575 manager.go:1196] Started watching for new ooms in manager
Apr 20 14:53:28.764881 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.764863 2575 manager.go:319] Starting recovery of all containers
Apr 20 14:53:28.770430 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:28.770404 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-82.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 20 14:53:28.770535 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:28.770479 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 14:53:28.772125 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:28.770584 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-82.ec2.internal.18a818558f49537f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-82.ec2.internal,UID:ip-10-0-129-82.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-82.ec2.internal,},FirstTimestamp:2026-04-20 14:53:28.752894847 +0000 UTC m=+0.484213124,LastTimestamp:2026-04-20 14:53:28.752894847 +0000 UTC m=+0.484213124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-82.ec2.internal,}"
Apr 20 14:53:28.773624 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.773492 2575 manager.go:324] Recovery completed
Apr 20 14:53:28.779048 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.779035 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:53:28.781975 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.781951 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-82.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:53:28.782034 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.781989 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-82.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:53:28.782034 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.782000 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-82.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:53:28.782469 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.782456 2575 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 14:53:28.782535 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.782470 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 14:53:28.782535 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.782487 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 14:53:28.783926 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:28.783864 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-82.ec2.internal.18a818559105079d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-82.ec2.internal,UID:ip-10-0-129-82.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-82.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-82.ec2.internal,},FirstTimestamp:2026-04-20 14:53:28.781973405 +0000 UTC m=+0.513291686,LastTimestamp:2026-04-20 14:53:28.781973405 +0000 UTC m=+0.513291686,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-82.ec2.internal,}"
Apr 20 14:53:28.785851 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.785836 2575 policy_none.go:49] "None policy: Start"
Apr 20 14:53:28.785851 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.785853 2575 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 14:53:28.785935 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.785863 2575 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 14:53:28.788771 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.788736 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wbm79"
Apr 20 14:53:28.796361 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.796345 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wbm79"
Apr 20 14:53:28.796475 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:28.796385 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-82.ec2.internal.18a81855910559be default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-82.ec2.internal,UID:ip-10-0-129-82.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-129-82.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-129-82.ec2.internal,},FirstTimestamp:2026-04-20 14:53:28.78199443 +0000 UTC m=+0.513312708,LastTimestamp:2026-04-20 14:53:28.78199443 +0000 UTC m=+0.513312708,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-82.ec2.internal,}"
Apr 20 14:53:28.813096 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.813077 2575 manager.go:341] "Starting Device Plugin manager"
Apr 20 14:53:28.813185 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:28.813115 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 14:53:28.813185 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.813128 2575 server.go:85] "Starting device plugin registration server"
Apr 20 14:53:28.813371 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.813359 2575 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 14:53:28.813440 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.813374 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 14:53:28.813488 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.813451 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 14:53:28.813566 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.813554 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 14:53:28.813623 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.813566 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 14:53:28.814060 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:28.814038 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 14:53:28.814124 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:28.814085 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-82.ec2.internal\" not found"
Apr 20 14:53:28.858363 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.858322 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 14:53:28.860620 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.859417 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 14:53:28.860620 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.859441 2575 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 14:53:28.860620 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.859471 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 14:53:28.860620 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.859481 2575 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 14:53:28.860620 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:28.859536 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 14:53:28.862985 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.862967 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 14:53:28.914014 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.913961 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:53:28.914976 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.914961 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-82.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:53:28.915059 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.914989 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-82.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:53:28.915059 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.915000 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-82.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:53:28.915059 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.915023 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-82.ec2.internal"
Apr 20 14:53:28.925385 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.925369 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-82.ec2.internal"
Apr 20 14:53:28.925428 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:28.925390 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-82.ec2.internal\": node \"ip-10-0-129-82.ec2.internal\" not found"
Apr 20 14:53:28.938161 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:28.938142 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-82.ec2.internal\" not found"
Apr 20 14:53:28.959609 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.959586 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-82.ec2.internal"]
Apr 20 14:53:28.959662 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.959650 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:53:28.960345 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.960331 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-82.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:53:28.960403 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.960357 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-82.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:53:28.960403 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.960370 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-82.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:53:28.962689 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.962677 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:53:28.962842 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.962829 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal"
Apr 20 14:53:28.962880 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.962857 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:53:28.963246 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.963229 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/351da3143bd95e36e22d1ccb0700c673-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal\" (UID: \"351da3143bd95e36e22d1ccb0700c673\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal"
Apr 20 14:53:28.963351 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.963258 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/351da3143bd95e36e22d1ccb0700c673-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal\" (UID: \"351da3143bd95e36e22d1ccb0700c673\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal"
Apr 20 14:53:28.963351 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.963284 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c510f351159c758475778f44c8d7da56-config\") pod \"kube-apiserver-proxy-ip-10-0-129-82.ec2.internal\" (UID: \"c510f351159c758475778f44c8d7da56\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-82.ec2.internal"
Apr 20 14:53:28.963351 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.963304 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-82.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:53:28.963351 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.963325 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-82.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:53:28.963351 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.963339 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-82.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:53:28.963547 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.963372 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-82.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:53:28.963547 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.963394 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-82.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:53:28.963547 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.963403 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-82.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:53:28.966071 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.966058 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-82.ec2.internal"
Apr 20 14:53:28.966124 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.966081 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:53:28.966687 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.966666 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-82.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:53:28.966756 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.966692 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-82.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:53:28.966756 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:28.966706 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-82.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:53:28.996162 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:28.996144 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-82.ec2.internal\" not found" node="ip-10-0-129-82.ec2.internal"
Apr 20 14:53:29.000309 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:29.000295 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-82.ec2.internal\" not found" node="ip-10-0-129-82.ec2.internal"
Apr 20 14:53:29.038579 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:29.038555 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-82.ec2.internal\" not found"
Apr 20 14:53:29.063626 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.063603 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/351da3143bd95e36e22d1ccb0700c673-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal\" (UID: \"351da3143bd95e36e22d1ccb0700c673\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal"
Apr 20 14:53:29.063709 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.063630 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/351da3143bd95e36e22d1ccb0700c673-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal\" (UID: \"351da3143bd95e36e22d1ccb0700c673\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal"
Apr 20 14:53:29.063709 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.063648 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c510f351159c758475778f44c8d7da56-config\") pod \"kube-apiserver-proxy-ip-10-0-129-82.ec2.internal\" (UID: \"c510f351159c758475778f44c8d7da56\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-82.ec2.internal"
Apr 20 14:53:29.063709 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.063678 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/351da3143bd95e36e22d1ccb0700c673-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal\" (UID: \"351da3143bd95e36e22d1ccb0700c673\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal"
Apr 20 14:53:29.063709 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.063683 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/351da3143bd95e36e22d1ccb0700c673-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal\" (UID: \"351da3143bd95e36e22d1ccb0700c673\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal"
Apr 20 14:53:29.063827 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.063713 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c510f351159c758475778f44c8d7da56-config\") pod \"kube-apiserver-proxy-ip-10-0-129-82.ec2.internal\" (UID: \"c510f351159c758475778f44c8d7da56\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-82.ec2.internal"
Apr 20 14:53:29.138738 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:29.138704 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-82.ec2.internal\" not found"
Apr 20 14:53:29.239319 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:29.239259 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-82.ec2.internal\" not found"
Apr 20 14:53:29.297840 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.297814 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal"
Apr 20 14:53:29.302237 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.302219 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-82.ec2.internal"
Apr 20 14:53:29.339884 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:29.339860 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-82.ec2.internal\" not found"
Apr 20 14:53:29.440354 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:29.440317 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-82.ec2.internal\" not found"
Apr 20 14:53:29.540810 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:29.540753 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-82.ec2.internal\" not found"
Apr 20 14:53:29.641341 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:29.641302 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-82.ec2.internal\" not found"
Apr 20 14:53:29.663478 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.663456 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 14:53:29.663623 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.663596 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 14:53:29.742030 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:29.742003 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-82.ec2.internal\" not found"
Apr 20 14:53:29.761952 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.761933 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 14:53:29.782336 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.782310 2575 reflector.go:430]
"Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 14:53:29.796054 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.796003 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:53:29.799061 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.799040 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 14:48:28 +0000 UTC" deadline="2027-10-06 10:31:00.334724343 +0000 UTC" Apr 20 14:53:29.799107 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.799064 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12811h37m30.535665409s" Apr 20 14:53:29.829541 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.829505 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-vfrlx" Apr 20 14:53:29.836213 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.836193 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-vfrlx" Apr 20 14:53:29.842599 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:29.842579 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-82.ec2.internal\" not found" Apr 20 14:53:29.920564 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.920541 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:53:29.962917 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.962889 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-82.ec2.internal" Apr 20 14:53:29.976248 ip-10-0-129-82 kubenswrapper[2575]: I0420 
14:53:29.976223 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 14:53:29.977928 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.977913 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal" Apr 20 14:53:29.992260 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:29.992237 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 14:53:30.012211 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.012192 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:53:30.047659 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:30.047601 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc510f351159c758475778f44c8d7da56.slice/crio-ae254a4b70e38aed87986c2712f766844a25745fae4174f873798dffb8bb0a76 WatchSource:0}: Error finding container ae254a4b70e38aed87986c2712f766844a25745fae4174f873798dffb8bb0a76: Status 404 returned error can't find the container with id ae254a4b70e38aed87986c2712f766844a25745fae4174f873798dffb8bb0a76 Apr 20 14:53:30.048086 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:30.048066 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod351da3143bd95e36e22d1ccb0700c673.slice/crio-434b45a81a67e4d105e1042a3972d953a33da7346d76dbcefd5ad9dd61638164 WatchSource:0}: Error finding container 434b45a81a67e4d105e1042a3972d953a33da7346d76dbcefd5ad9dd61638164: Status 404 returned error can't find the container with id 434b45a81a67e4d105e1042a3972d953a33da7346d76dbcefd5ad9dd61638164 Apr 20 
14:53:30.053301 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.053285 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 14:53:30.740914 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.740883 2575 apiserver.go:52] "Watching apiserver" Apr 20 14:53:30.750219 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.750193 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 14:53:30.750638 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.750610 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-vbpm4","openshift-network-diagnostics/network-check-target-qmkvh","kube-system/konnectivity-agent-w4psc","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69","openshift-cluster-node-tuning-operator/tuned-2mklb","openshift-network-operator/iptables-alerter-vw8ch","openshift-ovn-kubernetes/ovnkube-node-q8mhl","kube-system/kube-apiserver-proxy-ip-10-0-129-82.ec2.internal","openshift-image-registry/node-ca-9vx92","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal","openshift-multus/multus-59qmp","openshift-multus/multus-additional-cni-plugins-wzx48"] Apr 20 14:53:30.753650 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.753626 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vw8ch" Apr 20 14:53:30.755855 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.755825 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:53:30.755951 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:30.755899 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vbpm4" podUID="aaf83337-5403-4bd0-b782-5d5fa014368f" Apr 20 14:53:30.756446 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.756421 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-7pr4z\"" Apr 20 14:53:30.756699 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.756576 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 14:53:30.756699 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.756587 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 14:53:30.756699 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.756669 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:53:30.758134 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.758031 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qmkvh" Apr 20 14:53:30.758134 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:30.758100 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qmkvh" podUID="8b7bbeab-141a-400c-a72b-4990b648aa44" Apr 20 14:53:30.760596 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.760406 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-w4psc" Apr 20 14:53:30.762692 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.762675 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-5267v\"" Apr 20 14:53:30.762909 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.762734 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 14:53:30.763157 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.763139 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 14:53:30.765308 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.765289 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69" Apr 20 14:53:30.765421 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.765394 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2mklb" Apr 20 14:53:30.768302 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.768227 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.769762 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.769743 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx6n6\" (UniqueName: \"kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6\") pod \"network-check-target-qmkvh\" (UID: \"8b7bbeab-141a-400c-a72b-4990b648aa44\") " pod="openshift-network-diagnostics/network-check-target-qmkvh" Apr 20 14:53:30.769844 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.769775 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8c083d6-bb9f-4c22-bcef-e04d04e09740-kubelet-dir\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: \"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69" Apr 20 14:53:30.769844 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.769803 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f8c083d6-bb9f-4c22-bcef-e04d04e09740-etc-selinux\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: \"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69" Apr 20 14:53:30.769844 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.769826 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n79m6\" (UniqueName: \"kubernetes.io/projected/f8c083d6-bb9f-4c22-bcef-e04d04e09740-kube-api-access-n79m6\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: \"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69" Apr 20 14:53:30.769987 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.769852 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0871071e-e935-405e-8b82-b08123f1734d-etc-tuned\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb" Apr 20 14:53:30.769987 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.769879 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkd82\" (UniqueName: \"kubernetes.io/projected/aaf83337-5403-4bd0-b782-5d5fa014368f-kube-api-access-rkd82\") pod \"network-metrics-daemon-vbpm4\" (UID: \"aaf83337-5403-4bd0-b782-5d5fa014368f\") " pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:53:30.769987 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.769901 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f4236e7c-46e3-443b-9430-39ff80fbd8dc-konnectivity-ca\") pod \"konnectivity-agent-w4psc\" (UID: \"f4236e7c-46e3-443b-9430-39ff80fbd8dc\") " pod="kube-system/konnectivity-agent-w4psc" Apr 20 14:53:30.769987 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.769922 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f8c083d6-bb9f-4c22-bcef-e04d04e09740-registration-dir\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: \"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69" Apr 20 14:53:30.769987 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.769944 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-etc-sysctl-conf\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-2mklb" Apr 20 14:53:30.769987 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.769965 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-etc-systemd\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb" Apr 20 14:53:30.770253 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.769986 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-run\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb" Apr 20 14:53:30.770253 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770007 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-lib-modules\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb" Apr 20 14:53:30.770253 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770028 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-host\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb" Apr 20 14:53:30.770253 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770050 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-etc-sysconfig\") pod \"tuned-2mklb\" 
(UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb" Apr 20 14:53:30.770253 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770073 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9b2dcedf-7d66-473e-a794-d90c68ccd475-iptables-alerter-script\") pod \"iptables-alerter-vw8ch\" (UID: \"9b2dcedf-7d66-473e-a794-d90c68ccd475\") " pod="openshift-network-operator/iptables-alerter-vw8ch" Apr 20 14:53:30.770253 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770095 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-etc-kubernetes\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb" Apr 20 14:53:30.770253 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770116 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8hvm\" (UniqueName: \"kubernetes.io/projected/0871071e-e935-405e-8b82-b08123f1734d-kube-api-access-l8hvm\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb" Apr 20 14:53:30.770253 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770175 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs\") pod \"network-metrics-daemon-vbpm4\" (UID: \"aaf83337-5403-4bd0-b782-5d5fa014368f\") " pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:53:30.770253 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770192 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 14:53:30.770253 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770197 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 14:53:30.770253 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770226 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f8c083d6-bb9f-4c22-bcef-e04d04e09740-device-dir\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: \"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69" Apr 20 14:53:30.770253 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770240 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 14:53:30.770731 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770260 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 14:53:30.770731 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770259 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-etc-modprobe-d\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb" Apr 20 14:53:30.770731 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770308 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0871071e-e935-405e-8b82-b08123f1734d-tmp\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-2mklb" Apr 20 14:53:30.770731 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770076 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 14:53:30.770731 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770354 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zblc8\" (UniqueName: \"kubernetes.io/projected/9b2dcedf-7d66-473e-a794-d90c68ccd475-kube-api-access-zblc8\") pod \"iptables-alerter-vw8ch\" (UID: \"9b2dcedf-7d66-473e-a794-d90c68ccd475\") " pod="openshift-network-operator/iptables-alerter-vw8ch" Apr 20 14:53:30.770731 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770383 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f8c083d6-bb9f-4c22-bcef-e04d04e09740-socket-dir\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: \"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69" Apr 20 14:53:30.770731 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770418 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f8c083d6-bb9f-4c22-bcef-e04d04e09740-sys-fs\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: \"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69" Apr 20 14:53:30.770731 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770447 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-etc-sysctl-d\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-2mklb" Apr 20 14:53:30.770731 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770470 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-var-lib-kubelet\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb" Apr 20 14:53:30.770731 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770489 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:53:30.770731 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770502 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9b2dcedf-7d66-473e-a794-d90c68ccd475-host-slash\") pod \"iptables-alerter-vw8ch\" (UID: \"9b2dcedf-7d66-473e-a794-d90c68ccd475\") " pod="openshift-network-operator/iptables-alerter-vw8ch" Apr 20 14:53:30.770731 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770569 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gc8mh\"" Apr 20 14:53:30.770731 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770571 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f4236e7c-46e3-443b-9430-39ff80fbd8dc-agent-certs\") pod \"konnectivity-agent-w4psc\" (UID: \"f4236e7c-46e3-443b-9430-39ff80fbd8dc\") " pod="kube-system/konnectivity-agent-w4psc" Apr 20 14:53:30.770731 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770592 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-h95c4\"" Apr 20 14:53:30.770731 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770682 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-sys\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb" Apr 20 14:53:30.771366 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.770938 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9vx92" Apr 20 14:53:30.771366 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.771013 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 14:53:30.771366 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.771156 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 14:53:30.771366 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.771270 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 14:53:30.771366 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.771294 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 14:53:30.771366 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.771313 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-f2g4l\"" Apr 20 14:53:30.771366 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.771272 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 
20 14:53:30.773177 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.773159 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.773797 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.773782 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6rxz6\""
Apr 20 14:53:30.773939 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.773921 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 14:53:30.774261 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.774235 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 14:53:30.774404 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.774243 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 14:53:30.775260 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.775244 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 14:53:30.775351 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.775342 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 14:53:30.775615 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.775595 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 14:53:30.775679 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.775597 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-gwml9\""
Apr 20 14:53:30.775679 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.775595 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 14:53:30.775793 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.775682 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wzx48"
Apr 20 14:53:30.780890 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.780860 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 14:53:30.780981 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.780900 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 14:53:30.781050 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.781034 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vk54s\""
Apr 20 14:53:30.837875 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.837835 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:48:29 +0000 UTC" deadline="2027-10-22 05:42:57.259608244 +0000 UTC"
Apr 20 14:53:30.837875 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.837868 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13190h49m26.421743116s"
Apr 20 14:53:30.863679 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.863625 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-82.ec2.internal" event={"ID":"c510f351159c758475778f44c8d7da56","Type":"ContainerStarted","Data":"ae254a4b70e38aed87986c2712f766844a25745fae4174f873798dffb8bb0a76"}
Apr 20 14:53:30.863841 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.863822 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 14:53:30.865457 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.865430 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal" event={"ID":"351da3143bd95e36e22d1ccb0700c673","Type":"ContainerStarted","Data":"434b45a81a67e4d105e1042a3972d953a33da7346d76dbcefd5ad9dd61638164"}
Apr 20 14:53:30.871482 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.871436 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9b2dcedf-7d66-473e-a794-d90c68ccd475-host-slash\") pod \"iptables-alerter-vw8ch\" (UID: \"9b2dcedf-7d66-473e-a794-d90c68ccd475\") " pod="openshift-network-operator/iptables-alerter-vw8ch"
Apr 20 14:53:30.871587 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.871474 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-sys\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.871587 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.871551 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b8808028-95d4-494d-8038-d6152f52c0e3-cnibin\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48"
Apr 20 14:53:30.871587 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.871563 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9b2dcedf-7d66-473e-a794-d90c68ccd475-host-slash\") pod \"iptables-alerter-vw8ch\" (UID: \"9b2dcedf-7d66-473e-a794-d90c68ccd475\") " pod="openshift-network-operator/iptables-alerter-vw8ch"
Apr 20 14:53:30.871587 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.871576 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-systemd-units\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.871734 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.871602 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-multus-socket-dir-parent\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.871734 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.871632 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kx6n6\" (UniqueName: \"kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6\") pod \"network-check-target-qmkvh\" (UID: \"8b7bbeab-141a-400c-a72b-4990b648aa44\") " pod="openshift-network-diagnostics/network-check-target-qmkvh"
Apr 20 14:53:30.871734 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.871644 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-sys\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.871734 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.871656 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n79m6\" (UniqueName: \"kubernetes.io/projected/f8c083d6-bb9f-4c22-bcef-e04d04e09740-kube-api-access-n79m6\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: \"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69"
Apr 20 14:53:30.871734 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.871686 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8808028-95d4-494d-8038-d6152f52c0e3-system-cni-dir\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48"
Apr 20 14:53:30.871734 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.871708 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-slash\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.871734 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.871731 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/007a8d25-f684-41a4-a2f6-d4e2f7bd79d5-host\") pod \"node-ca-9vx92\" (UID: \"007a8d25-f684-41a4-a2f6-d4e2f7bd79d5\") " pod="openshift-image-registry/node-ca-9vx92"
Apr 20 14:53:30.872027 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.871754 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-multus-cni-dir\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.872027 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.871780 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkd82\" (UniqueName: \"kubernetes.io/projected/aaf83337-5403-4bd0-b782-5d5fa014368f-kube-api-access-rkd82\") pod \"network-metrics-daemon-vbpm4\" (UID: \"aaf83337-5403-4bd0-b782-5d5fa014368f\") " pod="openshift-multus/network-metrics-daemon-vbpm4"
Apr 20 14:53:30.872027 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.871807 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b839cc0-9133-43ab-a8ea-a31b28df87b2-ovn-node-metrics-cert\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.872027 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.871830 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-host-run-k8s-cni-cncf-io\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.872027 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.871881 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f4236e7c-46e3-443b-9430-39ff80fbd8dc-konnectivity-ca\") pod \"konnectivity-agent-w4psc\" (UID: \"f4236e7c-46e3-443b-9430-39ff80fbd8dc\") " pod="kube-system/konnectivity-agent-w4psc"
Apr 20 14:53:30.872027 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.871922 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f8c083d6-bb9f-4c22-bcef-e04d04e09740-registration-dir\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: \"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69"
Apr 20 14:53:30.872027 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.871949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-host\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.872027 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.871979 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b8808028-95d4-494d-8038-d6152f52c0e3-cni-binary-copy\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48"
Apr 20 14:53:30.872027 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872006 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-host-var-lib-cni-bin\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.872027 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872012 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f8c083d6-bb9f-4c22-bcef-e04d04e09740-registration-dir\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: \"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69"
Apr 20 14:53:30.872466 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872030 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-host-var-lib-cni-multus\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.872466 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872042 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-host\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.872466 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872088 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-etc-sysconfig\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.872466 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872119 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3b839cc0-9133-43ab-a8ea-a31b28df87b2-ovnkube-config\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.872466 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872145 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-hostroot\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.872466 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872162 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-etc-sysconfig\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.872466 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/326469e8-2bee-4754-a084-2cfc2ffe79a2-multus-daemon-config\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.872466 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872209 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9b2dcedf-7d66-473e-a794-d90c68ccd475-iptables-alerter-script\") pod \"iptables-alerter-vw8ch\" (UID: \"9b2dcedf-7d66-473e-a794-d90c68ccd475\") " pod="openshift-network-operator/iptables-alerter-vw8ch"
Apr 20 14:53:30.872466 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-etc-kubernetes\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.872466 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872294 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8hvm\" (UniqueName: \"kubernetes.io/projected/0871071e-e935-405e-8b82-b08123f1734d-kube-api-access-l8hvm\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.872466 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872321 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8808028-95d4-494d-8038-d6152f52c0e3-os-release\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48"
Apr 20 14:53:30.872466 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872349 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-node-log\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.872466 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872373 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-system-cni-dir\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.872466 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872375 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-etc-kubernetes\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.872466 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872394 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-cnibin\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.872466 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872417 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-multus-conf-dir\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.872466 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872444 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-etc-modprobe-d\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.873204 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872444 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f4236e7c-46e3-443b-9430-39ff80fbd8dc-konnectivity-ca\") pod \"konnectivity-agent-w4psc\" (UID: \"f4236e7c-46e3-443b-9430-39ff80fbd8dc\") " pod="kube-system/konnectivity-agent-w4psc"
Apr 20 14:53:30.873204 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872466 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0871071e-e935-405e-8b82-b08123f1734d-tmp\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.873204 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872506 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-var-lib-openvswitch\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.873204 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872549 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-cni-netd\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.873204 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872579 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f8c083d6-bb9f-4c22-bcef-e04d04e09740-socket-dir\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: \"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69"
Apr 20 14:53:30.873204 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872605 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f8c083d6-bb9f-4c22-bcef-e04d04e09740-sys-fs\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: \"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69"
Apr 20 14:53:30.873204 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872630 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-etc-sysctl-d\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.873204 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872655 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-var-lib-kubelet\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.873204 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872682 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-run-systemd\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.873204 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872707 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f4236e7c-46e3-443b-9430-39ff80fbd8dc-agent-certs\") pod \"konnectivity-agent-w4psc\" (UID: \"f4236e7c-46e3-443b-9430-39ff80fbd8dc\") " pod="kube-system/konnectivity-agent-w4psc"
Apr 20 14:53:30.873204 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872732 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lj2d\" (UniqueName: \"kubernetes.io/projected/b8808028-95d4-494d-8038-d6152f52c0e3-kube-api-access-7lj2d\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48"
Apr 20 14:53:30.873204 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872755 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-cni-bin\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.873204 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872776 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f8c083d6-bb9f-4c22-bcef-e04d04e09740-socket-dir\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: \"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69"
Apr 20 14:53:30.873204 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872785 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9b2dcedf-7d66-473e-a794-d90c68ccd475-iptables-alerter-script\") pod \"iptables-alerter-vw8ch\" (UID: \"9b2dcedf-7d66-473e-a794-d90c68ccd475\") " pod="openshift-network-operator/iptables-alerter-vw8ch"
Apr 20 14:53:30.873204 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872779 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqss8\" (UniqueName: \"kubernetes.io/projected/007a8d25-f684-41a4-a2f6-d4e2f7bd79d5-kube-api-access-sqss8\") pod \"node-ca-9vx92\" (UID: \"007a8d25-f684-41a4-a2f6-d4e2f7bd79d5\") " pod="openshift-image-registry/node-ca-9vx92"
Apr 20 14:53:30.873204 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872798 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f8c083d6-bb9f-4c22-bcef-e04d04e09740-sys-fs\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: \"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69"
Apr 20 14:53:30.873204 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872798 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 20 14:53:30.873942 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872856 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-var-lib-kubelet\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.873942 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872905 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/326469e8-2bee-4754-a084-2cfc2ffe79a2-cni-binary-copy\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.873942 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872930 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-host-run-netns\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.873942 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872973 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8c083d6-bb9f-4c22-bcef-e04d04e09740-kubelet-dir\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: \"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69"
Apr 20 14:53:30.873942 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.872998 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f8c083d6-bb9f-4c22-bcef-e04d04e09740-etc-selinux\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: \"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69"
Apr 20 14:53:30.873942 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873010 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-etc-sysctl-d\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.873942 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873011 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-etc-modprobe-d\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.873942 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873021 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0871071e-e935-405e-8b82-b08123f1734d-etc-tuned\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.873942 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873055 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3b839cc0-9133-43ab-a8ea-a31b28df87b2-env-overrides\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.873942 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873063 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8c083d6-bb9f-4c22-bcef-e04d04e09740-kubelet-dir\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: \"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69"
Apr 20 14:53:30.873942 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873103 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-etc-openvswitch\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.873942 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873133 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f8c083d6-bb9f-4c22-bcef-e04d04e09740-etc-selinux\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: \"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69"
Apr 20 14:53:30.873942 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873136 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-run-ovn\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.873942 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873176 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3b839cc0-9133-43ab-a8ea-a31b28df87b2-ovnkube-script-lib\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.873942 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873205 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-host-var-lib-kubelet\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.873942 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873229 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvzdt\" (UniqueName: \"kubernetes.io/projected/326469e8-2bee-4754-a084-2cfc2ffe79a2-kube-api-access-dvzdt\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.873942 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873257 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-etc-sysctl-conf\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.874703 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-etc-systemd\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.874703 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873297 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-run\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.874703 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873316 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-lib-modules\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.874703 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873336 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-kubelet\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.874703 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873339 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-etc-systemd\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.874703 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873355 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-os-release\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.874703 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873376 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b8808028-95d4-494d-8038-d6152f52c0e3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48"
Apr 20 14:53:30.874703 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873375 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-run\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.874703 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873397 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b8808028-95d4-494d-8038-d6152f52c0e3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48"
Apr 20 14:53:30.874703 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873375 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-etc-sysctl-conf\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.874703 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873434 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0871071e-e935-405e-8b82-b08123f1734d-lib-modules\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:30.874703 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873434 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-run-openvswitch\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\")
" pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.874703 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873471 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/007a8d25-f684-41a4-a2f6-d4e2f7bd79d5-serviceca\") pod \"node-ca-9vx92\" (UID: \"007a8d25-f684-41a4-a2f6-d4e2f7bd79d5\") " pod="openshift-image-registry/node-ca-9vx92" Apr 20 14:53:30.874703 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873497 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-host-run-multus-certs\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp" Apr 20 14:53:30.874703 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873538 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.874703 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873567 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs\") pod \"network-metrics-daemon-vbpm4\" (UID: \"aaf83337-5403-4bd0-b782-5d5fa014368f\") " pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:53:30.874703 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873597 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/f8c083d6-bb9f-4c22-bcef-e04d04e09740-device-dir\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: \"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69" Apr 20 14:53:30.875424 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873620 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-log-socket\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.875424 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873652 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-run-ovn-kubernetes\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.875424 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873677 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpfcc\" (UniqueName: \"kubernetes.io/projected/3b839cc0-9133-43ab-a8ea-a31b28df87b2-kube-api-access-bpfcc\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.875424 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873682 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f8c083d6-bb9f-4c22-bcef-e04d04e09740-device-dir\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: \"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69" Apr 20 14:53:30.875424 ip-10-0-129-82 kubenswrapper[2575]: I0420 
14:53:30.873698 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-etc-kubernetes\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp" Apr 20 14:53:30.875424 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:30.873697 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:30.875424 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873724 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zblc8\" (UniqueName: \"kubernetes.io/projected/9b2dcedf-7d66-473e-a794-d90c68ccd475-kube-api-access-zblc8\") pod \"iptables-alerter-vw8ch\" (UID: \"9b2dcedf-7d66-473e-a794-d90c68ccd475\") " pod="openshift-network-operator/iptables-alerter-vw8ch" Apr 20 14:53:30.875424 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873747 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8808028-95d4-494d-8038-d6152f52c0e3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48" Apr 20 14:53:30.875424 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:30.873789 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs podName:aaf83337-5403-4bd0-b782-5d5fa014368f nodeName:}" failed. No retries permitted until 2026-04-20 14:53:31.373757212 +0000 UTC m=+3.105075486 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs") pod "network-metrics-daemon-vbpm4" (UID: "aaf83337-5403-4bd0-b782-5d5fa014368f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:30.875424 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.873812 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-run-netns\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.876363 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.876306 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0871071e-e935-405e-8b82-b08123f1734d-etc-tuned\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb" Apr 20 14:53:30.876589 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.876566 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f4236e7c-46e3-443b-9430-39ff80fbd8dc-agent-certs\") pod \"konnectivity-agent-w4psc\" (UID: \"f4236e7c-46e3-443b-9430-39ff80fbd8dc\") " pod="kube-system/konnectivity-agent-w4psc" Apr 20 14:53:30.876680 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.876661 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0871071e-e935-405e-8b82-b08123f1734d-tmp\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb" Apr 20 14:53:30.899948 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:30.899918 2575 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:53:30.899948 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:30.899953 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:53:30.900147 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:30.899967 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kx6n6 for pod openshift-network-diagnostics/network-check-target-qmkvh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:30.900147 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:30.900046 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6 podName:8b7bbeab-141a-400c-a72b-4990b648aa44 nodeName:}" failed. No retries permitted until 2026-04-20 14:53:31.400026094 +0000 UTC m=+3.131344363 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kx6n6" (UniqueName: "kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6") pod "network-check-target-qmkvh" (UID: "8b7bbeab-141a-400c-a72b-4990b648aa44") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:30.903074 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.902685 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8hvm\" (UniqueName: \"kubernetes.io/projected/0871071e-e935-405e-8b82-b08123f1734d-kube-api-access-l8hvm\") pod \"tuned-2mklb\" (UID: \"0871071e-e935-405e-8b82-b08123f1734d\") " pod="openshift-cluster-node-tuning-operator/tuned-2mklb" Apr 20 14:53:30.904420 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.904393 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zblc8\" (UniqueName: \"kubernetes.io/projected/9b2dcedf-7d66-473e-a794-d90c68ccd475-kube-api-access-zblc8\") pod \"iptables-alerter-vw8ch\" (UID: \"9b2dcedf-7d66-473e-a794-d90c68ccd475\") " pod="openshift-network-operator/iptables-alerter-vw8ch" Apr 20 14:53:30.905685 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.905658 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkd82\" (UniqueName: \"kubernetes.io/projected/aaf83337-5403-4bd0-b782-5d5fa014368f-kube-api-access-rkd82\") pod \"network-metrics-daemon-vbpm4\" (UID: \"aaf83337-5403-4bd0-b782-5d5fa014368f\") " pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:53:30.909499 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.909479 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n79m6\" (UniqueName: \"kubernetes.io/projected/f8c083d6-bb9f-4c22-bcef-e04d04e09740-kube-api-access-n79m6\") pod \"aws-ebs-csi-driver-node-69z69\" (UID: 
\"f8c083d6-bb9f-4c22-bcef-e04d04e09740\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69" Apr 20 14:53:30.974759 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.974637 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.974759 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.974699 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-log-socket\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.974759 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.974721 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-run-ovn-kubernetes\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.974759 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.974737 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpfcc\" (UniqueName: \"kubernetes.io/projected/3b839cc0-9133-43ab-a8ea-a31b28df87b2-kube-api-access-bpfcc\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.974759 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.974757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-etc-kubernetes\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp" Apr 20 14:53:30.975121 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.974783 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8808028-95d4-494d-8038-d6152f52c0e3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48" Apr 20 14:53:30.975121 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.974818 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-log-socket\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.975121 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.974862 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-run-netns\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.975121 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.974896 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.975121 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.974907 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-run-netns\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.975121 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.974923 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b8808028-95d4-494d-8038-d6152f52c0e3-cnibin\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48" Apr 20 14:53:30.975121 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.974951 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-systemd-units\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.975121 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.974956 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-etc-kubernetes\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp" Apr 20 14:53:30.975121 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.974979 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-multus-socket-dir-parent\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp" Apr 20 14:53:30.975121 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.975010 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/b8808028-95d4-494d-8038-d6152f52c0e3-cnibin\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48" Apr 20 14:53:30.975121 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.975014 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8808028-95d4-494d-8038-d6152f52c0e3-system-cni-dir\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48" Apr 20 14:53:30.975121 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.975042 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8808028-95d4-494d-8038-d6152f52c0e3-system-cni-dir\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48" Apr 20 14:53:30.975121 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.975053 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-slash\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.975121 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.975081 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/007a8d25-f684-41a4-a2f6-d4e2f7bd79d5-host\") pod \"node-ca-9vx92\" (UID: \"007a8d25-f684-41a4-a2f6-d4e2f7bd79d5\") " pod="openshift-image-registry/node-ca-9vx92" Apr 20 14:53:30.975121 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.975084 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-systemd-units\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.975121 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.975106 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-multus-cni-dir\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp" Apr 20 14:53:30.975121 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.975114 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-run-ovn-kubernetes\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.975824 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.975134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b839cc0-9133-43ab-a8ea-a31b28df87b2-ovn-node-metrics-cert\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.975824 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.975151 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-multus-socket-dir-parent\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp" Apr 20 14:53:30.975824 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.975155 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host\" (UniqueName: \"kubernetes.io/host-path/007a8d25-f684-41a4-a2f6-d4e2f7bd79d5-host\") pod \"node-ca-9vx92\" (UID: \"007a8d25-f684-41a4-a2f6-d4e2f7bd79d5\") " pod="openshift-image-registry/node-ca-9vx92" Apr 20 14:53:30.975824 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.975160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-host-run-k8s-cni-cncf-io\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp" Apr 20 14:53:30.975824 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.975182 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-slash\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.975824 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.975189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b8808028-95d4-494d-8038-d6152f52c0e3-cni-binary-copy\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48" Apr 20 14:53:30.975824 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.975081 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8808028-95d4-494d-8038-d6152f52c0e3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48" Apr 20 14:53:30.975824 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.975182 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-multus-cni-dir\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp" Apr 20 14:53:30.975824 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.975230 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-host-run-k8s-cni-cncf-io\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp" Apr 20 14:53:30.975824 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.975313 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-host-var-lib-cni-bin\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp" Apr 20 14:53:30.976228 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.976167 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-host-var-lib-cni-multus\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp" Apr 20 14:53:30.976274 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.976226 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3b839cc0-9133-43ab-a8ea-a31b28df87b2-ovnkube-config\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.976274 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.976256 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" 
(UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-hostroot\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp" Apr 20 14:53:30.976368 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.976305 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/326469e8-2bee-4754-a084-2cfc2ffe79a2-multus-daemon-config\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp" Apr 20 14:53:30.976368 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.976335 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8808028-95d4-494d-8038-d6152f52c0e3-os-release\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48" Apr 20 14:53:30.976447 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.976397 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-node-log\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.976496 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.976455 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-node-log\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" Apr 20 14:53:30.976567 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.976529 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-hostroot\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.977042 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.977018 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3b839cc0-9133-43ab-a8ea-a31b28df87b2-ovnkube-config\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.977144 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.977047 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/326469e8-2bee-4754-a084-2cfc2ffe79a2-multus-daemon-config\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.977144 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.977081 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-system-cni-dir\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.977144 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.977116 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-cnibin\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.977144 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.977140 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-multus-conf-dir\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.977332 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.977134 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8808028-95d4-494d-8038-d6152f52c0e3-os-release\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48"
Apr 20 14:53:30.977332 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.977179 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-var-lib-openvswitch\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.977332 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.976125 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-host-var-lib-cni-bin\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.977332 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.977214 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-cni-netd\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.977332 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.977215 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-system-cni-dir\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.977332 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.977266 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-host-var-lib-cni-multus\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.977332 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.977311 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-cni-netd\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.977626 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.977358 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-var-lib-openvswitch\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.977626 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.977404 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-multus-conf-dir\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.977626 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.977483 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-cnibin\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.977756 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.977663 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b8808028-95d4-494d-8038-d6152f52c0e3-cni-binary-copy\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48"
Apr 20 14:53:30.978267 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.978245 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b839cc0-9133-43ab-a8ea-a31b28df87b2-ovn-node-metrics-cert\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.979473 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.979445 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-run-systemd\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.979578 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.979495 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lj2d\" (UniqueName: \"kubernetes.io/projected/b8808028-95d4-494d-8038-d6152f52c0e3-kube-api-access-7lj2d\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48"
Apr 20 14:53:30.979643 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.979568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-cni-bin\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.979643 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.979611 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqss8\" (UniqueName: \"kubernetes.io/projected/007a8d25-f684-41a4-a2f6-d4e2f7bd79d5-kube-api-access-sqss8\") pod \"node-ca-9vx92\" (UID: \"007a8d25-f684-41a4-a2f6-d4e2f7bd79d5\") " pod="openshift-image-registry/node-ca-9vx92"
Apr 20 14:53:30.979643 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.979521 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-run-systemd\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.979784 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.979640 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/326469e8-2bee-4754-a084-2cfc2ffe79a2-cni-binary-copy\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.979784 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.979672 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-host-run-netns\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.979784 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.979682 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-cni-bin\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.979784 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.979699 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3b839cc0-9133-43ab-a8ea-a31b28df87b2-env-overrides\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.979961 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.979788 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-etc-openvswitch\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.979961 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.979730 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-etc-openvswitch\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.979961 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.979847 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-host-run-netns\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.979961 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.979859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-run-ovn\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.979961 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.979934 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3b839cc0-9133-43ab-a8ea-a31b28df87b2-ovnkube-script-lib\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.980186 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.979977 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-run-ovn\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.980186 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.979977 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-host-var-lib-kubelet\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.980186 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.980046 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-host-var-lib-kubelet\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.980186 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.980065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvzdt\" (UniqueName: \"kubernetes.io/projected/326469e8-2bee-4754-a084-2cfc2ffe79a2-kube-api-access-dvzdt\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.980186 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.980124 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-kubelet\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.980413 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.980206 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-os-release\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.980413 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.980307 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-os-release\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.980413 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.980318 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3b839cc0-9133-43ab-a8ea-a31b28df87b2-env-overrides\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.980413 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.980361 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b8808028-95d4-494d-8038-d6152f52c0e3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48"
Apr 20 14:53:30.980602 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.980411 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-host-kubelet\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.980602 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.980419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b8808028-95d4-494d-8038-d6152f52c0e3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48"
Apr 20 14:53:30.980602 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.980501 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-run-openvswitch\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.980602 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.980552 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/007a8d25-f684-41a4-a2f6-d4e2f7bd79d5-serviceca\") pod \"node-ca-9vx92\" (UID: \"007a8d25-f684-41a4-a2f6-d4e2f7bd79d5\") " pod="openshift-image-registry/node-ca-9vx92"
Apr 20 14:53:30.980781 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.980584 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-host-run-multus-certs\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.980781 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.980648 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/326469e8-2bee-4754-a084-2cfc2ffe79a2-host-run-multus-certs\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.980781 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.980695 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b839cc0-9133-43ab-a8ea-a31b28df87b2-run-openvswitch\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.980781 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.980755 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3b839cc0-9133-43ab-a8ea-a31b28df87b2-ovnkube-script-lib\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:30.980991 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.980946 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b8808028-95d4-494d-8038-d6152f52c0e3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48"
Apr 20 14:53:30.981045 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.980984 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b8808028-95d4-494d-8038-d6152f52c0e3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48"
Apr 20 14:53:30.981107 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.981088 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/007a8d25-f684-41a4-a2f6-d4e2f7bd79d5-serviceca\") pod \"node-ca-9vx92\" (UID: \"007a8d25-f684-41a4-a2f6-d4e2f7bd79d5\") " pod="openshift-image-registry/node-ca-9vx92"
Apr 20 14:53:30.982215 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.982189 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/326469e8-2bee-4754-a084-2cfc2ffe79a2-cni-binary-copy\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:30.997883 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:30.997823 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-ht2vg"]
Apr 20 14:53:31.001990 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.001960 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ht2vg"
Apr 20 14:53:31.002078 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:31.002032 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ht2vg" podUID="2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae"
Apr 20 14:53:31.003404 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.003381 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lj2d\" (UniqueName: \"kubernetes.io/projected/b8808028-95d4-494d-8038-d6152f52c0e3-kube-api-access-7lj2d\") pod \"multus-additional-cni-plugins-wzx48\" (UID: \"b8808028-95d4-494d-8038-d6152f52c0e3\") " pod="openshift-multus/multus-additional-cni-plugins-wzx48"
Apr 20 14:53:31.004998 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.004977 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpfcc\" (UniqueName: \"kubernetes.io/projected/3b839cc0-9133-43ab-a8ea-a31b28df87b2-kube-api-access-bpfcc\") pod \"ovnkube-node-q8mhl\" (UID: \"3b839cc0-9133-43ab-a8ea-a31b28df87b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:31.010898 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.010880 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqss8\" (UniqueName: \"kubernetes.io/projected/007a8d25-f684-41a4-a2f6-d4e2f7bd79d5-kube-api-access-sqss8\") pod \"node-ca-9vx92\" (UID: \"007a8d25-f684-41a4-a2f6-d4e2f7bd79d5\") " pod="openshift-image-registry/node-ca-9vx92"
Apr 20 14:53:31.012533 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.012500 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvzdt\" (UniqueName: \"kubernetes.io/projected/326469e8-2bee-4754-a084-2cfc2ffe79a2-kube-api-access-dvzdt\") pod \"multus-59qmp\" (UID: \"326469e8-2bee-4754-a084-2cfc2ffe79a2\") " pod="openshift-multus/multus-59qmp"
Apr 20 14:53:31.064813 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.064763 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vw8ch"
Apr 20 14:53:31.074090 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.074063 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-w4psc"
Apr 20 14:53:31.081257 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.081229 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-kubelet-config\") pod \"global-pull-secret-syncer-ht2vg\" (UID: \"2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae\") " pod="kube-system/global-pull-secret-syncer-ht2vg"
Apr 20 14:53:31.081350 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.081268 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-dbus\") pod \"global-pull-secret-syncer-ht2vg\" (UID: \"2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae\") " pod="kube-system/global-pull-secret-syncer-ht2vg"
Apr 20 14:53:31.081405 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.081358 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret\") pod \"global-pull-secret-syncer-ht2vg\" (UID: \"2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae\") " pod="kube-system/global-pull-secret-syncer-ht2vg"
Apr 20 14:53:31.084305 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.084280 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69"
Apr 20 14:53:31.089997 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.089978 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2mklb"
Apr 20 14:53:31.094093 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.094072 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:31.099641 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.099624 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9vx92"
Apr 20 14:53:31.105048 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.105032 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-59qmp"
Apr 20 14:53:31.110569 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.110550 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wzx48"
Apr 20 14:53:31.182477 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.182442 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret\") pod \"global-pull-secret-syncer-ht2vg\" (UID: \"2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae\") " pod="kube-system/global-pull-secret-syncer-ht2vg"
Apr 20 14:53:31.182672 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.182523 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-kubelet-config\") pod \"global-pull-secret-syncer-ht2vg\" (UID: \"2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae\") " pod="kube-system/global-pull-secret-syncer-ht2vg"
Apr 20 14:53:31.182672 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.182549 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-dbus\") pod \"global-pull-secret-syncer-ht2vg\" (UID: \"2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae\") " pod="kube-system/global-pull-secret-syncer-ht2vg"
Apr 20 14:53:31.182672 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:31.182590 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 14:53:31.182672 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.182634 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-kubelet-config\") pod \"global-pull-secret-syncer-ht2vg\" (UID: \"2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae\") " pod="kube-system/global-pull-secret-syncer-ht2vg"
Apr 20 14:53:31.182672 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:31.182652 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret podName:2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae nodeName:}" failed. No retries permitted until 2026-04-20 14:53:31.682637104 +0000 UTC m=+3.413955384 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret") pod "global-pull-secret-syncer-ht2vg" (UID: "2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae") : object "kube-system"/"original-pull-secret" not registered
Apr 20 14:53:31.182929 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.182730 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-dbus\") pod \"global-pull-secret-syncer-ht2vg\" (UID: \"2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae\") " pod="kube-system/global-pull-secret-syncer-ht2vg"
Apr 20 14:53:31.214663 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.214629 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 14:53:31.384003 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.383920 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs\") pod \"network-metrics-daemon-vbpm4\" (UID: \"aaf83337-5403-4bd0-b782-5d5fa014368f\") " pod="openshift-multus/network-metrics-daemon-vbpm4"
Apr 20 14:53:31.384161 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:31.384074 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:53:31.384161 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:31.384133 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs podName:aaf83337-5403-4bd0-b782-5d5fa014368f nodeName:}" failed. No retries permitted until 2026-04-20 14:53:32.384114603 +0000 UTC m=+4.115432868 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs") pod "network-metrics-daemon-vbpm4" (UID: "aaf83337-5403-4bd0-b782-5d5fa014368f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:53:31.485193 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.485164 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kx6n6\" (UniqueName: \"kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6\") pod \"network-check-target-qmkvh\" (UID: \"8b7bbeab-141a-400c-a72b-4990b648aa44\") " pod="openshift-network-diagnostics/network-check-target-qmkvh"
Apr 20 14:53:31.485375 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:31.485342 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 14:53:31.485375 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:31.485365 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 14:53:31.485471 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:31.485378 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kx6n6 for pod openshift-network-diagnostics/network-check-target-qmkvh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:53:31.485471 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:31.485438 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6 podName:8b7bbeab-141a-400c-a72b-4990b648aa44 nodeName:}" failed. No retries permitted until 2026-04-20 14:53:32.485420881 +0000 UTC m=+4.216739150 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-kx6n6" (UniqueName: "kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6") pod "network-check-target-qmkvh" (UID: "8b7bbeab-141a-400c-a72b-4990b648aa44") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:53:31.679401 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:31.679371 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0871071e_e935_405e_8b82_b08123f1734d.slice/crio-26ecab5d15a25115957103b760df0008295c22ecb173fde8b2aacb13fd4211b2 WatchSource:0}: Error finding container 26ecab5d15a25115957103b760df0008295c22ecb173fde8b2aacb13fd4211b2: Status 404 returned error can't find the container with id 26ecab5d15a25115957103b760df0008295c22ecb173fde8b2aacb13fd4211b2
Apr 20 14:53:31.687013 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.686990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret\") pod \"global-pull-secret-syncer-ht2vg\" (UID: \"2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae\") " pod="kube-system/global-pull-secret-syncer-ht2vg"
Apr 20 14:53:31.687117 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:31.687073 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b839cc0_9133_43ab_a8ea_a31b28df87b2.slice/crio-950d866a3547d9fe2699b0d7d8a0320ee749169b52b7c169052e93a1e3e186de WatchSource:0}: Error finding container 950d866a3547d9fe2699b0d7d8a0320ee749169b52b7c169052e93a1e3e186de: Status 404 returned error can't find the container with id 950d866a3547d9fe2699b0d7d8a0320ee749169b52b7c169052e93a1e3e186de
Apr 20 14:53:31.687117 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:31.687111 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 14:53:31.687257 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:31.687176 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret podName:2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae nodeName:}" failed. No retries permitted until 2026-04-20 14:53:32.687156809 +0000 UTC m=+4.418475085 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret") pod "global-pull-secret-syncer-ht2vg" (UID: "2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae") : object "kube-system"/"original-pull-secret" not registered
Apr 20 14:53:31.687827 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:31.687722 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8808028_95d4_494d_8038_d6152f52c0e3.slice/crio-977127ab64af138799b1b5b714ddae64ba6917204c6bbfbf41318defcf31f93e WatchSource:0}: Error finding container 977127ab64af138799b1b5b714ddae64ba6917204c6bbfbf41318defcf31f93e: Status 404 returned error can't find the container with id 977127ab64af138799b1b5b714ddae64ba6917204c6bbfbf41318defcf31f93e
Apr 20 14:53:31.688724 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:31.688706 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod007a8d25_f684_41a4_a2f6_d4e2f7bd79d5.slice/crio-19d8b0877dcffa60be76e99ca6bf9a045b32b4364d38f27de28ea849df3d7116 WatchSource:0}: Error finding container 19d8b0877dcffa60be76e99ca6bf9a045b32b4364d38f27de28ea849df3d7116: Status 404 returned error can't find the container with id 19d8b0877dcffa60be76e99ca6bf9a045b32b4364d38f27de28ea849df3d7116
Apr 20 14:53:31.689578 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:31.689555 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4236e7c_46e3_443b_9430_39ff80fbd8dc.slice/crio-09a3d764b46b7221aa1ac6bcd65d0f295bfb89b27e3f35128b7441db7a39a585 WatchSource:0}: Error finding container 09a3d764b46b7221aa1ac6bcd65d0f295bfb89b27e3f35128b7441db7a39a585: Status 404 returned error can't find the container with id 09a3d764b46b7221aa1ac6bcd65d0f295bfb89b27e3f35128b7441db7a39a585
Apr 20 14:53:31.690301 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:31.690276 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8c083d6_bb9f_4c22_bcef_e04d04e09740.slice/crio-1f7ad80b2c45a8e065163a3e293f04309f922256d4ba0e4aec9e96a1cf593885 WatchSource:0}: Error finding container 1f7ad80b2c45a8e065163a3e293f04309f922256d4ba0e4aec9e96a1cf593885: Status 404 returned error can't find the container with id 1f7ad80b2c45a8e065163a3e293f04309f922256d4ba0e4aec9e96a1cf593885
Apr 20 14:53:31.839151 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.838895 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:48:29 +0000 UTC" deadline="2027-10-30 18:49:34.855515898 +0000 UTC"
Apr 20 14:53:31.839151 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.839150 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13395h56m3.016372578s"
Apr 20 14:53:31.868789 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.868753 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-82.ec2.internal" event={"ID":"c510f351159c758475778f44c8d7da56","Type":"ContainerStarted","Data":"46d6170f05f76b974c393c86a0c97953bdae4aca6216ea2c905c15909744aa5d"}
Apr 20 14:53:31.869837 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.869804 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vw8ch" event={"ID":"9b2dcedf-7d66-473e-a794-d90c68ccd475","Type":"ContainerStarted","Data":"a7bc9f8ccb80e74baf6c85dbb36e3629cca0e9220ec22550eac7844b163013e3"}
Apr 20 14:53:31.870741 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.870718 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wzx48" event={"ID":"b8808028-95d4-494d-8038-d6152f52c0e3","Type":"ContainerStarted","Data":"977127ab64af138799b1b5b714ddae64ba6917204c6bbfbf41318defcf31f93e"}
Apr 20 14:53:31.871685 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.871659 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69" event={"ID":"f8c083d6-bb9f-4c22-bcef-e04d04e09740","Type":"ContainerStarted","Data":"1f7ad80b2c45a8e065163a3e293f04309f922256d4ba0e4aec9e96a1cf593885"}
Apr 20 14:53:31.872606 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.872587 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-w4psc" event={"ID":"f4236e7c-46e3-443b-9430-39ff80fbd8dc","Type":"ContainerStarted","Data":"09a3d764b46b7221aa1ac6bcd65d0f295bfb89b27e3f35128b7441db7a39a585"}
Apr 20 14:53:31.873486 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.873450 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-59qmp" event={"ID":"326469e8-2bee-4754-a084-2cfc2ffe79a2","Type":"ContainerStarted","Data":"7f3f34f16764512e97af19e51589b5fefb0293097ea5f539eb67e7edb2845751"}
Apr 20 14:53:31.874387 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.874358 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-image-registry/node-ca-9vx92" event={"ID":"007a8d25-f684-41a4-a2f6-d4e2f7bd79d5","Type":"ContainerStarted","Data":"19d8b0877dcffa60be76e99ca6bf9a045b32b4364d38f27de28ea849df3d7116"} Apr 20 14:53:31.875335 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.875314 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" event={"ID":"3b839cc0-9133-43ab-a8ea-a31b28df87b2","Type":"ContainerStarted","Data":"950d866a3547d9fe2699b0d7d8a0320ee749169b52b7c169052e93a1e3e186de"} Apr 20 14:53:31.876219 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:31.876198 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2mklb" event={"ID":"0871071e-e935-405e-8b82-b08123f1734d","Type":"ContainerStarted","Data":"26ecab5d15a25115957103b760df0008295c22ecb173fde8b2aacb13fd4211b2"} Apr 20 14:53:32.392050 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:32.392017 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs\") pod \"network-metrics-daemon-vbpm4\" (UID: \"aaf83337-5403-4bd0-b782-5d5fa014368f\") " pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:53:32.392219 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:32.392149 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:32.392219 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:32.392207 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs podName:aaf83337-5403-4bd0-b782-5d5fa014368f nodeName:}" failed. No retries permitted until 2026-04-20 14:53:34.392189291 +0000 UTC m=+6.123507561 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs") pod "network-metrics-daemon-vbpm4" (UID: "aaf83337-5403-4bd0-b782-5d5fa014368f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:32.493179 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:32.493099 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kx6n6\" (UniqueName: \"kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6\") pod \"network-check-target-qmkvh\" (UID: \"8b7bbeab-141a-400c-a72b-4990b648aa44\") " pod="openshift-network-diagnostics/network-check-target-qmkvh" Apr 20 14:53:32.493405 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:32.493255 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:53:32.493405 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:32.493274 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:53:32.493405 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:32.493286 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kx6n6 for pod openshift-network-diagnostics/network-check-target-qmkvh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:32.493405 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:32.493339 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6 podName:8b7bbeab-141a-400c-a72b-4990b648aa44 nodeName:}" failed. 
No retries permitted until 2026-04-20 14:53:34.493321636 +0000 UTC m=+6.224639902 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-kx6n6" (UniqueName: "kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6") pod "network-check-target-qmkvh" (UID: "8b7bbeab-141a-400c-a72b-4990b648aa44") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:32.694663 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:32.694625 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret\") pod \"global-pull-secret-syncer-ht2vg\" (UID: \"2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae\") " pod="kube-system/global-pull-secret-syncer-ht2vg" Apr 20 14:53:32.694841 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:32.694766 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:53:32.694841 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:32.694820 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret podName:2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae nodeName:}" failed. No retries permitted until 2026-04-20 14:53:34.694801994 +0000 UTC m=+6.426120279 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret") pod "global-pull-secret-syncer-ht2vg" (UID: "2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:53:32.869468 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:32.868634 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qmkvh" Apr 20 14:53:32.869468 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:32.868749 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qmkvh" podUID="8b7bbeab-141a-400c-a72b-4990b648aa44" Apr 20 14:53:32.869468 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:32.869126 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ht2vg" Apr 20 14:53:32.869468 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:32.869221 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ht2vg" podUID="2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae" Apr 20 14:53:32.869468 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:32.869297 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:53:32.869468 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:32.869375 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vbpm4" podUID="aaf83337-5403-4bd0-b782-5d5fa014368f" Apr 20 14:53:32.884317 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:32.883851 2575 generic.go:358] "Generic (PLEG): container finished" podID="351da3143bd95e36e22d1ccb0700c673" containerID="1534e40cae8a927ffa7e061e18db32d34ee4bdda4644b039f912d74359cd2002" exitCode=0 Apr 20 14:53:32.884317 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:32.883946 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal" event={"ID":"351da3143bd95e36e22d1ccb0700c673","Type":"ContainerDied","Data":"1534e40cae8a927ffa7e061e18db32d34ee4bdda4644b039f912d74359cd2002"} Apr 20 14:53:32.902006 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:32.901947 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-82.ec2.internal" podStartSLOduration=3.9019302590000002 podStartE2EDuration="3.901930259s" podCreationTimestamp="2026-04-20 14:53:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:53:31.886307179 +0000 UTC m=+3.617625457" watchObservedRunningTime="2026-04-20 14:53:32.901930259 +0000 UTC m=+4.633248548" Apr 20 14:53:33.899885 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:33.899850 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal" event={"ID":"351da3143bd95e36e22d1ccb0700c673","Type":"ContainerStarted","Data":"9b618ea65d35aa154e0d50334232c015c822a9e67801144ab3cd92e7f52f2fe7"} Apr 20 14:53:34.409828 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:34.409794 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs\") pod 
\"network-metrics-daemon-vbpm4\" (UID: \"aaf83337-5403-4bd0-b782-5d5fa014368f\") " pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:53:34.410012 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:34.409954 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:34.410071 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:34.410022 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs podName:aaf83337-5403-4bd0-b782-5d5fa014368f nodeName:}" failed. No retries permitted until 2026-04-20 14:53:38.410003126 +0000 UTC m=+10.141321396 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs") pod "network-metrics-daemon-vbpm4" (UID: "aaf83337-5403-4bd0-b782-5d5fa014368f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:34.510972 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:34.510935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kx6n6\" (UniqueName: \"kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6\") pod \"network-check-target-qmkvh\" (UID: \"8b7bbeab-141a-400c-a72b-4990b648aa44\") " pod="openshift-network-diagnostics/network-check-target-qmkvh" Apr 20 14:53:34.511167 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:34.511146 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:53:34.511167 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:34.511165 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 
20 14:53:34.511279 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:34.511177 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kx6n6 for pod openshift-network-diagnostics/network-check-target-qmkvh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:34.511279 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:34.511240 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6 podName:8b7bbeab-141a-400c-a72b-4990b648aa44 nodeName:}" failed. No retries permitted until 2026-04-20 14:53:38.511221547 +0000 UTC m=+10.242539825 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-kx6n6" (UniqueName: "kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6") pod "network-check-target-qmkvh" (UID: "8b7bbeab-141a-400c-a72b-4990b648aa44") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:34.713135 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:34.712555 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret\") pod \"global-pull-secret-syncer-ht2vg\" (UID: \"2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae\") " pod="kube-system/global-pull-secret-syncer-ht2vg" Apr 20 14:53:34.713135 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:34.712693 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:53:34.713135 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:34.712756 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret podName:2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae nodeName:}" failed. No retries permitted until 2026-04-20 14:53:38.712738185 +0000 UTC m=+10.444056454 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret") pod "global-pull-secret-syncer-ht2vg" (UID: "2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:53:34.859802 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:34.859767 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qmkvh" Apr 20 14:53:34.859958 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:34.859884 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qmkvh" podUID="8b7bbeab-141a-400c-a72b-4990b648aa44" Apr 20 14:53:34.860285 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:34.860262 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ht2vg" Apr 20 14:53:34.860393 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:34.860371 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ht2vg" podUID="2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae" Apr 20 14:53:34.860450 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:34.860440 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:53:34.860565 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:34.860536 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vbpm4" podUID="aaf83337-5403-4bd0-b782-5d5fa014368f" Apr 20 14:53:36.863894 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:36.863863 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qmkvh" Apr 20 14:53:36.863894 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:36.863893 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ht2vg" Apr 20 14:53:36.864396 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:36.864001 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qmkvh" podUID="8b7bbeab-141a-400c-a72b-4990b648aa44" Apr 20 14:53:36.864396 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:36.864048 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ht2vg" podUID="2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae" Apr 20 14:53:36.865246 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:36.865206 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:53:36.865635 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:36.865404 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vbpm4" podUID="aaf83337-5403-4bd0-b782-5d5fa014368f" Apr 20 14:53:38.444349 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:38.444313 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs\") pod \"network-metrics-daemon-vbpm4\" (UID: \"aaf83337-5403-4bd0-b782-5d5fa014368f\") " pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:53:38.444825 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:38.444549 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:38.444825 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:38.444624 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs podName:aaf83337-5403-4bd0-b782-5d5fa014368f nodeName:}" failed. No retries permitted until 2026-04-20 14:53:46.444604 +0000 UTC m=+18.175922279 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs") pod "network-metrics-daemon-vbpm4" (UID: "aaf83337-5403-4bd0-b782-5d5fa014368f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:38.544761 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:38.544718 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kx6n6\" (UniqueName: \"kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6\") pod \"network-check-target-qmkvh\" (UID: \"8b7bbeab-141a-400c-a72b-4990b648aa44\") " pod="openshift-network-diagnostics/network-check-target-qmkvh" Apr 20 14:53:38.544957 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:38.544896 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:53:38.544957 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:38.544919 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:53:38.544957 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:38.544932 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kx6n6 for pod openshift-network-diagnostics/network-check-target-qmkvh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:38.545120 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:38.544988 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6 podName:8b7bbeab-141a-400c-a72b-4990b648aa44 nodeName:}" failed. 
No retries permitted until 2026-04-20 14:53:46.544970166 +0000 UTC m=+18.276288435 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-kx6n6" (UniqueName: "kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6") pod "network-check-target-qmkvh" (UID: "8b7bbeab-141a-400c-a72b-4990b648aa44") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:38.746726 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:38.746618 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret\") pod \"global-pull-secret-syncer-ht2vg\" (UID: \"2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae\") " pod="kube-system/global-pull-secret-syncer-ht2vg" Apr 20 14:53:38.746890 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:38.746788 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:53:38.746890 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:38.746855 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret podName:2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae nodeName:}" failed. No retries permitted until 2026-04-20 14:53:46.746836282 +0000 UTC m=+18.478154550 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret") pod "global-pull-secret-syncer-ht2vg" (UID: "2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:53:38.826275 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:38.826233 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-82.ec2.internal" podStartSLOduration=9.826215689 podStartE2EDuration="9.826215689s" podCreationTimestamp="2026-04-20 14:53:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:53:33.915400844 +0000 UTC m=+5.646719158" watchObservedRunningTime="2026-04-20 14:53:38.826215689 +0000 UTC m=+10.557533977" Apr 20 14:53:38.827401 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:38.826685 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hscb7"] Apr 20 14:53:38.832555 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:38.832533 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hscb7" Apr 20 14:53:38.835566 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:38.835545 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-nzkkj\"" Apr 20 14:53:38.835816 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:38.835797 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 14:53:38.836035 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:38.836020 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 14:53:38.863133 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:38.863108 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qmkvh" Apr 20 14:53:38.863315 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:38.863143 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ht2vg" Apr 20 14:53:38.863315 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:38.863223 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qmkvh" podUID="8b7bbeab-141a-400c-a72b-4990b648aa44" Apr 20 14:53:38.863315 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:38.863282 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:53:38.863315 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:38.863277 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ht2vg" podUID="2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae" Apr 20 14:53:38.863531 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:38.863360 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vbpm4" podUID="aaf83337-5403-4bd0-b782-5d5fa014368f" Apr 20 14:53:38.948301 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:38.948262 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d8c441a8-d683-4309-8a59-c1525285f7e1-hosts-file\") pod \"node-resolver-hscb7\" (UID: \"d8c441a8-d683-4309-8a59-c1525285f7e1\") " pod="openshift-dns/node-resolver-hscb7" Apr 20 14:53:38.948460 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:38.948398 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvbs7\" (UniqueName: \"kubernetes.io/projected/d8c441a8-d683-4309-8a59-c1525285f7e1-kube-api-access-xvbs7\") pod \"node-resolver-hscb7\" (UID: \"d8c441a8-d683-4309-8a59-c1525285f7e1\") " pod="openshift-dns/node-resolver-hscb7" Apr 20 14:53:38.948460 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:38.948448 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d8c441a8-d683-4309-8a59-c1525285f7e1-tmp-dir\") pod \"node-resolver-hscb7\" (UID: \"d8c441a8-d683-4309-8a59-c1525285f7e1\") " pod="openshift-dns/node-resolver-hscb7" Apr 20 14:53:39.049309 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:39.049224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d8c441a8-d683-4309-8a59-c1525285f7e1-hosts-file\") pod \"node-resolver-hscb7\" (UID: \"d8c441a8-d683-4309-8a59-c1525285f7e1\") " pod="openshift-dns/node-resolver-hscb7" Apr 20 14:53:39.049457 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:39.049307 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvbs7\" (UniqueName: \"kubernetes.io/projected/d8c441a8-d683-4309-8a59-c1525285f7e1-kube-api-access-xvbs7\") pod \"node-resolver-hscb7\" (UID: \"d8c441a8-d683-4309-8a59-c1525285f7e1\") " pod="openshift-dns/node-resolver-hscb7" Apr 20 14:53:39.049457 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:39.049373 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d8c441a8-d683-4309-8a59-c1525285f7e1-tmp-dir\") pod \"node-resolver-hscb7\" (UID: \"d8c441a8-d683-4309-8a59-c1525285f7e1\") " pod="openshift-dns/node-resolver-hscb7" Apr 20 14:53:39.049457 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:39.049429 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d8c441a8-d683-4309-8a59-c1525285f7e1-hosts-file\") pod \"node-resolver-hscb7\" (UID: \"d8c441a8-d683-4309-8a59-c1525285f7e1\") " pod="openshift-dns/node-resolver-hscb7" Apr 20 14:53:39.049788 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:39.049766 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/d8c441a8-d683-4309-8a59-c1525285f7e1-tmp-dir\") pod \"node-resolver-hscb7\" (UID: \"d8c441a8-d683-4309-8a59-c1525285f7e1\") " pod="openshift-dns/node-resolver-hscb7" Apr 20 14:53:39.064177 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:39.062269 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvbs7\" (UniqueName: \"kubernetes.io/projected/d8c441a8-d683-4309-8a59-c1525285f7e1-kube-api-access-xvbs7\") pod \"node-resolver-hscb7\" (UID: \"d8c441a8-d683-4309-8a59-c1525285f7e1\") " pod="openshift-dns/node-resolver-hscb7" Apr 20 14:53:39.142339 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:39.142309 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hscb7" Apr 20 14:53:40.860734 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:40.860697 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ht2vg" Apr 20 14:53:40.861192 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:40.860697 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qmkvh" Apr 20 14:53:40.861192 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:40.860814 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ht2vg" podUID="2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae" Apr 20 14:53:40.861192 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:40.860918 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qmkvh" podUID="8b7bbeab-141a-400c-a72b-4990b648aa44" Apr 20 14:53:40.861192 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:40.860707 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:53:40.861192 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:40.861032 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vbpm4" podUID="aaf83337-5403-4bd0-b782-5d5fa014368f" Apr 20 14:53:42.860471 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:42.860430 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:53:42.860912 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:42.860430 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-ht2vg" Apr 20 14:53:42.860912 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:42.860583 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vbpm4" podUID="aaf83337-5403-4bd0-b782-5d5fa014368f" Apr 20 14:53:42.860912 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:42.860438 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qmkvh" Apr 20 14:53:42.860912 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:42.860641 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ht2vg" podUID="2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae" Apr 20 14:53:42.860912 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:42.860709 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qmkvh" podUID="8b7bbeab-141a-400c-a72b-4990b648aa44" Apr 20 14:53:44.860226 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:44.860121 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-ht2vg" Apr 20 14:53:44.860226 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:44.860161 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qmkvh" Apr 20 14:53:44.860226 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:44.860143 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:53:44.860777 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:44.860256 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ht2vg" podUID="2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae" Apr 20 14:53:44.860777 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:44.860341 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qmkvh" podUID="8b7bbeab-141a-400c-a72b-4990b648aa44" Apr 20 14:53:44.860777 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:44.860441 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vbpm4" podUID="aaf83337-5403-4bd0-b782-5d5fa014368f" Apr 20 14:53:46.503478 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:46.503436 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs\") pod \"network-metrics-daemon-vbpm4\" (UID: \"aaf83337-5403-4bd0-b782-5d5fa014368f\") " pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:53:46.503982 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:46.503632 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:46.503982 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:46.503715 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs podName:aaf83337-5403-4bd0-b782-5d5fa014368f nodeName:}" failed. No retries permitted until 2026-04-20 14:54:02.503691411 +0000 UTC m=+34.235009679 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs") pod "network-metrics-daemon-vbpm4" (UID: "aaf83337-5403-4bd0-b782-5d5fa014368f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:46.604465 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:46.604419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kx6n6\" (UniqueName: \"kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6\") pod \"network-check-target-qmkvh\" (UID: \"8b7bbeab-141a-400c-a72b-4990b648aa44\") " pod="openshift-network-diagnostics/network-check-target-qmkvh" Apr 20 14:53:46.604631 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:46.604601 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:53:46.604631 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:46.604625 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:53:46.604721 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:46.604635 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kx6n6 for pod openshift-network-diagnostics/network-check-target-qmkvh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:46.604721 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:46.604708 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6 podName:8b7bbeab-141a-400c-a72b-4990b648aa44 nodeName:}" failed. 
No retries permitted until 2026-04-20 14:54:02.604694213 +0000 UTC m=+34.336012478 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-kx6n6" (UniqueName: "kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6") pod "network-check-target-qmkvh" (UID: "8b7bbeab-141a-400c-a72b-4990b648aa44") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:46.805751 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:46.805664 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret\") pod \"global-pull-secret-syncer-ht2vg\" (UID: \"2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae\") " pod="kube-system/global-pull-secret-syncer-ht2vg" Apr 20 14:53:46.805909 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:46.805815 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:53:46.805909 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:46.805876 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret podName:2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae nodeName:}" failed. No retries permitted until 2026-04-20 14:54:02.805860584 +0000 UTC m=+34.537178863 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret") pod "global-pull-secret-syncer-ht2vg" (UID: "2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:53:46.860700 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:46.860657 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:53:46.860899 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:46.860801 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vbpm4" podUID="aaf83337-5403-4bd0-b782-5d5fa014368f" Apr 20 14:53:46.860899 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:46.860847 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qmkvh" Apr 20 14:53:46.860899 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:46.860867 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ht2vg" Apr 20 14:53:46.861051 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:46.860950 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qmkvh" podUID="8b7bbeab-141a-400c-a72b-4990b648aa44" Apr 20 14:53:46.861051 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:46.861033 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ht2vg" podUID="2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae" Apr 20 14:53:47.795060 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:53:47.795027 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8c441a8_d683_4309_8a59_c1525285f7e1.slice/crio-d476824a777f632d0d0de78573463313d2a458e2d919827b3d6b8ef10059498d WatchSource:0}: Error finding container d476824a777f632d0d0de78573463313d2a458e2d919827b3d6b8ef10059498d: Status 404 returned error can't find the container with id d476824a777f632d0d0de78573463313d2a458e2d919827b3d6b8ef10059498d Apr 20 14:53:47.927028 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:47.926990 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hscb7" event={"ID":"d8c441a8-d683-4309-8a59-c1525285f7e1","Type":"ContainerStarted","Data":"d476824a777f632d0d0de78573463313d2a458e2d919827b3d6b8ef10059498d"} Apr 20 14:53:48.860815 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.860560 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ht2vg" Apr 20 14:53:48.861627 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.860577 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:53:48.861627 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:48.860844 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ht2vg" podUID="2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae" Apr 20 14:53:48.861627 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.860560 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qmkvh" Apr 20 14:53:48.861627 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:48.860951 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vbpm4" podUID="aaf83337-5403-4bd0-b782-5d5fa014368f" Apr 20 14:53:48.861627 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:48.861067 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qmkvh" podUID="8b7bbeab-141a-400c-a72b-4990b648aa44" Apr 20 14:53:48.930523 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.930473 2575 generic.go:358] "Generic (PLEG): container finished" podID="b8808028-95d4-494d-8038-d6152f52c0e3" containerID="ba3da2013e4940e7b7037127647e0e0ab3cc0e9f49b4d6578d3be3b78d7c7cf2" exitCode=0 Apr 20 14:53:48.930693 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.930568 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wzx48" event={"ID":"b8808028-95d4-494d-8038-d6152f52c0e3","Type":"ContainerDied","Data":"ba3da2013e4940e7b7037127647e0e0ab3cc0e9f49b4d6578d3be3b78d7c7cf2"} Apr 20 14:53:48.931794 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.931774 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69" event={"ID":"f8c083d6-bb9f-4c22-bcef-e04d04e09740","Type":"ContainerStarted","Data":"0bfef1a678d9e28820d5240cd68c9b7f6c87c57f2ea49f4e52190f1fbe843723"} Apr 20 14:53:48.932893 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.932870 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-w4psc" event={"ID":"f4236e7c-46e3-443b-9430-39ff80fbd8dc","Type":"ContainerStarted","Data":"ec699dfccfd28bf1c4562a20576370a8f839ef8dc9307509c34b56daec94337a"} Apr 20 14:53:48.934005 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.933983 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hscb7" event={"ID":"d8c441a8-d683-4309-8a59-c1525285f7e1","Type":"ContainerStarted","Data":"a9bbe0906acebd59f87068f70082ee2955ed26b88c7facdcf9c1d92e1caaae54"} Apr 20 14:53:48.935182 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.935158 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-59qmp" 
event={"ID":"326469e8-2bee-4754-a084-2cfc2ffe79a2","Type":"ContainerStarted","Data":"dea59f0582cff3faf1791d704f4e56c5555d03e18df391ee6da99d0627c05418"} Apr 20 14:53:48.936288 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.936272 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9vx92" event={"ID":"007a8d25-f684-41a4-a2f6-d4e2f7bd79d5","Type":"ContainerStarted","Data":"25d37dc3bb327c1ed9111356703bfd1c63364b690890142b7321110838916090"} Apr 20 14:53:48.938410 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.938390 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-acl-logging/0.log" Apr 20 14:53:48.938683 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.938667 2575 generic.go:358] "Generic (PLEG): container finished" podID="3b839cc0-9133-43ab-a8ea-a31b28df87b2" containerID="518efe61c43a43c016033aa9dd37cea266c10914a787fa26be07da3a9d009bf7" exitCode=1 Apr 20 14:53:48.938747 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.938729 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" event={"ID":"3b839cc0-9133-43ab-a8ea-a31b28df87b2","Type":"ContainerStarted","Data":"53addd014a35f219cc90595b90dbb984eb2721bc8852a0541f922c3697237837"} Apr 20 14:53:48.938803 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.938750 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" event={"ID":"3b839cc0-9133-43ab-a8ea-a31b28df87b2","Type":"ContainerStarted","Data":"5019b69491bc9a7807b3416f381a92e71d8f844f5fd4866525cbd881c4de9897"} Apr 20 14:53:48.938803 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.938765 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" 
event={"ID":"3b839cc0-9133-43ab-a8ea-a31b28df87b2","Type":"ContainerStarted","Data":"9a27b9dc90234062b3909d24f2b3aaac36e76fcca7129899444d692cd68edec9"} Apr 20 14:53:48.938803 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.938776 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" event={"ID":"3b839cc0-9133-43ab-a8ea-a31b28df87b2","Type":"ContainerStarted","Data":"e0cf56ac38a73f2906ce20fb2ee21bbdd53eaee046d1884b7aab33f5dec733ec"} Apr 20 14:53:48.938803 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.938788 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" event={"ID":"3b839cc0-9133-43ab-a8ea-a31b28df87b2","Type":"ContainerDied","Data":"518efe61c43a43c016033aa9dd37cea266c10914a787fa26be07da3a9d009bf7"} Apr 20 14:53:48.938941 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.938803 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" event={"ID":"3b839cc0-9133-43ab-a8ea-a31b28df87b2","Type":"ContainerStarted","Data":"788c41d521ec673d18a2bc4c54daeb4b7ef8cac8f39b56040a28ea5907f370f2"} Apr 20 14:53:48.939775 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.939757 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2mklb" event={"ID":"0871071e-e935-405e-8b82-b08123f1734d","Type":"ContainerStarted","Data":"44c6e2ae46edd2c1c3556083dc92798445dbcfafb2a261145d750e0f20d53f31"} Apr 20 14:53:48.991738 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.991683 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-2mklb" podStartSLOduration=3.845333165 podStartE2EDuration="19.991661637s" podCreationTimestamp="2026-04-20 14:53:29 +0000 UTC" firstStartedPulling="2026-04-20 14:53:31.683405264 +0000 UTC m=+3.414723529" lastFinishedPulling="2026-04-20 14:53:47.829733734 +0000 UTC m=+19.561052001" 
observedRunningTime="2026-04-20 14:53:48.979082748 +0000 UTC m=+20.710401036" watchObservedRunningTime="2026-04-20 14:53:48.991661637 +0000 UTC m=+20.722979925" Apr 20 14:53:48.992203 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:48.992169 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9vx92" podStartSLOduration=3.923453446 podStartE2EDuration="19.992161114s" podCreationTimestamp="2026-04-20 14:53:29 +0000 UTC" firstStartedPulling="2026-04-20 14:53:31.691013329 +0000 UTC m=+3.422331597" lastFinishedPulling="2026-04-20 14:53:47.759721 +0000 UTC m=+19.491039265" observedRunningTime="2026-04-20 14:53:48.991806503 +0000 UTC m=+20.723124786" watchObservedRunningTime="2026-04-20 14:53:48.992161114 +0000 UTC m=+20.723479401" Apr 20 14:53:49.010330 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:49.010268 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-w4psc" podStartSLOduration=3.941892761 podStartE2EDuration="20.010249614s" podCreationTimestamp="2026-04-20 14:53:29 +0000 UTC" firstStartedPulling="2026-04-20 14:53:31.691368537 +0000 UTC m=+3.422686803" lastFinishedPulling="2026-04-20 14:53:47.759725386 +0000 UTC m=+19.491043656" observedRunningTime="2026-04-20 14:53:49.009797667 +0000 UTC m=+20.741115978" watchObservedRunningTime="2026-04-20 14:53:49.010249614 +0000 UTC m=+20.741567902" Apr 20 14:53:49.023988 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:49.023890 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-59qmp" podStartSLOduration=3.771290159 podStartE2EDuration="20.023869377s" podCreationTimestamp="2026-04-20 14:53:29 +0000 UTC" firstStartedPulling="2026-04-20 14:53:31.693169974 +0000 UTC m=+3.424488253" lastFinishedPulling="2026-04-20 14:53:47.945749203 +0000 UTC m=+19.677067471" observedRunningTime="2026-04-20 14:53:49.023865433 +0000 UTC m=+20.755183725" 
watchObservedRunningTime="2026-04-20 14:53:49.023869377 +0000 UTC m=+20.755187665" Apr 20 14:53:49.042234 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:49.042169 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hscb7" podStartSLOduration=11.042149105 podStartE2EDuration="11.042149105s" podCreationTimestamp="2026-04-20 14:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:53:49.04178165 +0000 UTC m=+20.773099937" watchObservedRunningTime="2026-04-20 14:53:49.042149105 +0000 UTC m=+20.773467403" Apr 20 14:53:49.124116 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:49.124075 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 14:53:49.827578 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:49.827455 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T14:53:49.12409609Z","UUID":"a89c1086-8291-4a08-925c-e7092955bc9f","Handler":null,"Name":"","Endpoint":""} Apr 20 14:53:49.830353 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:49.830323 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 14:53:49.830353 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:49.830361 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 14:53:49.943897 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:49.943865 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69" 
event={"ID":"f8c083d6-bb9f-4c22-bcef-e04d04e09740","Type":"ContainerStarted","Data":"7af272964ccccf30fd1dcd88187209b2616fab65561244b579252c7482ef60d8"} Apr 20 14:53:49.945131 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:49.945100 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vw8ch" event={"ID":"9b2dcedf-7d66-473e-a794-d90c68ccd475","Type":"ContainerStarted","Data":"539e55b85b2f3392edcfb62ac7ba4c3aa73646fea7dd2f7f0f774b190bd1b0b5"} Apr 20 14:53:49.959689 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:49.959629 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-vw8ch" podStartSLOduration=5.823948932 podStartE2EDuration="21.959608451s" podCreationTimestamp="2026-04-20 14:53:28 +0000 UTC" firstStartedPulling="2026-04-20 14:53:31.69371089 +0000 UTC m=+3.425029162" lastFinishedPulling="2026-04-20 14:53:47.829370403 +0000 UTC m=+19.560688681" observedRunningTime="2026-04-20 14:53:49.958950228 +0000 UTC m=+21.690268525" watchObservedRunningTime="2026-04-20 14:53:49.959608451 +0000 UTC m=+21.690926739" Apr 20 14:53:50.860523 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:50.860290 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qmkvh" Apr 20 14:53:50.860729 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:50.860353 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:53:50.860729 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:50.860626 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qmkvh" podUID="8b7bbeab-141a-400c-a72b-4990b648aa44" Apr 20 14:53:50.860729 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:50.860362 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ht2vg" Apr 20 14:53:50.860893 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:50.860728 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vbpm4" podUID="aaf83337-5403-4bd0-b782-5d5fa014368f" Apr 20 14:53:50.860893 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:50.860784 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ht2vg" podUID="2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae" Apr 20 14:53:50.949904 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:50.949826 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-acl-logging/0.log" Apr 20 14:53:50.950321 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:50.950172 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" event={"ID":"3b839cc0-9133-43ab-a8ea-a31b28df87b2","Type":"ContainerStarted","Data":"580227d1b7b844678d04ba45d8abb89dc86a0fd4ab7f9355e7ad69deed52bf91"} Apr 20 14:53:50.952204 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:50.952168 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69" event={"ID":"f8c083d6-bb9f-4c22-bcef-e04d04e09740","Type":"ContainerStarted","Data":"17b2a7d8ec4fd2d1d320e4657342ea0064b9410a2e2074bd0acfe3d834c2d2fd"} Apr 20 14:53:50.967153 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:50.967090 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69z69" podStartSLOduration=3.269915128 podStartE2EDuration="21.967070407s" podCreationTimestamp="2026-04-20 14:53:29 +0000 UTC" firstStartedPulling="2026-04-20 14:53:31.693506453 +0000 UTC m=+3.424824724" lastFinishedPulling="2026-04-20 14:53:50.390661728 +0000 UTC m=+22.121980003" observedRunningTime="2026-04-20 14:53:50.966688959 +0000 UTC m=+22.698007246" watchObservedRunningTime="2026-04-20 14:53:50.967070407 +0000 UTC m=+22.698388695" Apr 20 14:53:51.928907 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:51.928819 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-w4psc" Apr 20 14:53:51.929415 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:51.929395 2575 kubelet.go:2658] 
"SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-w4psc"
Apr 20 14:53:51.954171 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:51.954097 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-w4psc"
Apr 20 14:53:51.954786 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:51.954480 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-w4psc"
Apr 20 14:53:52.859885 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:52.859856 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qmkvh"
Apr 20 14:53:52.860049 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:52.859855 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ht2vg"
Apr 20 14:53:52.860049 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:52.859981 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qmkvh" podUID="8b7bbeab-141a-400c-a72b-4990b648aa44"
Apr 20 14:53:52.860168 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:52.859856 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vbpm4"
Apr 20 14:53:52.860168 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:52.860063 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ht2vg" podUID="2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae"
Apr 20 14:53:52.860266 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:52.860176 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vbpm4" podUID="aaf83337-5403-4bd0-b782-5d5fa014368f"
Apr 20 14:53:53.961906 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:53.961737 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-acl-logging/0.log"
Apr 20 14:53:53.962573 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:53.962198 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" event={"ID":"3b839cc0-9133-43ab-a8ea-a31b28df87b2","Type":"ContainerStarted","Data":"b5648767acee2aa51356483ced00b993d1958a67955aff7529b2ff90ba9fad16"}
Apr 20 14:53:53.962573 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:53.962504 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:53.962573 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:53.962559 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:53.962573 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:53.962571 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:53.962776 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:53.962712 2575 scope.go:117] "RemoveContainer" containerID="518efe61c43a43c016033aa9dd37cea266c10914a787fa26be07da3a9d009bf7"
Apr 20 14:53:53.964051 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:53.964028 2575 generic.go:358] "Generic (PLEG): container finished" podID="b8808028-95d4-494d-8038-d6152f52c0e3" containerID="46bba96e243268b6467851a52e7b47936dc6e083e903cca7ccd097f85262892a" exitCode=0
Apr 20 14:53:53.964161 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:53.964116 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wzx48" event={"ID":"b8808028-95d4-494d-8038-d6152f52c0e3","Type":"ContainerDied","Data":"46bba96e243268b6467851a52e7b47936dc6e083e903cca7ccd097f85262892a"}
Apr 20 14:53:53.977611 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:53.977575 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:53.977729 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:53.977653 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:53:54.860736 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:54.860668 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ht2vg"
Apr 20 14:53:54.860866 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:54.860671 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qmkvh"
Apr 20 14:53:54.860903 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:54.860870 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qmkvh" podUID="8b7bbeab-141a-400c-a72b-4990b648aa44"
Apr 20 14:53:54.860903 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:54.860771 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ht2vg" podUID="2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae"
Apr 20 14:53:54.860903 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:54.860678 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vbpm4"
Apr 20 14:53:54.861044 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:54.860979 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vbpm4" podUID="aaf83337-5403-4bd0-b782-5d5fa014368f"
Apr 20 14:53:54.968261 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:54.968228 2575 generic.go:358] "Generic (PLEG): container finished" podID="b8808028-95d4-494d-8038-d6152f52c0e3" containerID="ae1dee5fa8656e1ae7f1e5683f346da8bbaba9cc1fdf2d11d53f781779df2e82" exitCode=0
Apr 20 14:53:54.968689 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:54.968298 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wzx48" event={"ID":"b8808028-95d4-494d-8038-d6152f52c0e3","Type":"ContainerDied","Data":"ae1dee5fa8656e1ae7f1e5683f346da8bbaba9cc1fdf2d11d53f781779df2e82"}
Apr 20 14:53:54.972134 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:54.972115 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-acl-logging/0.log"
Apr 20 14:53:54.972447 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:54.972420 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" event={"ID":"3b839cc0-9133-43ab-a8ea-a31b28df87b2","Type":"ContainerStarted","Data":"e048c3535049076237e74a0139ac61c7b27c0fd170e5eb7ac1ff93706e53eca1"}
Apr 20 14:53:55.027642 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:55.027572 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl" podStartSLOduration=9.810460814 podStartE2EDuration="26.027555058s" podCreationTimestamp="2026-04-20 14:53:29 +0000 UTC" firstStartedPulling="2026-04-20 14:53:31.688980298 +0000 UTC m=+3.420298581" lastFinishedPulling="2026-04-20 14:53:47.906074545 +0000 UTC m=+19.637392825" observedRunningTime="2026-04-20 14:53:55.025889114 +0000 UTC m=+26.757207426" watchObservedRunningTime="2026-04-20 14:53:55.027555058 +0000 UTC m=+26.758873344"
Apr 20 14:53:55.223072 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:55.223043 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ht2vg"]
Apr 20 14:53:55.223217 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:55.223162 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ht2vg"
Apr 20 14:53:55.223251 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:55.223239 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ht2vg" podUID="2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae"
Apr 20 14:53:55.227156 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:55.227132 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vbpm4"]
Apr 20 14:53:55.227272 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:55.227228 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vbpm4"
Apr 20 14:53:55.227310 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:55.227298 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vbpm4" podUID="aaf83337-5403-4bd0-b782-5d5fa014368f"
Apr 20 14:53:55.231808 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:55.231789 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qmkvh"]
Apr 20 14:53:55.231885 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:55.231866 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qmkvh"
Apr 20 14:53:55.231947 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:55.231928 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qmkvh" podUID="8b7bbeab-141a-400c-a72b-4990b648aa44"
Apr 20 14:53:55.976489 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:55.976457 2575 generic.go:358] "Generic (PLEG): container finished" podID="b8808028-95d4-494d-8038-d6152f52c0e3" containerID="1a38b225dec9c59673f502ddd3636d645933b602a5a1144a8764a271ce3024e4" exitCode=0
Apr 20 14:53:55.976950 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:55.976554 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wzx48" event={"ID":"b8808028-95d4-494d-8038-d6152f52c0e3","Type":"ContainerDied","Data":"1a38b225dec9c59673f502ddd3636d645933b602a5a1144a8764a271ce3024e4"}
Apr 20 14:53:56.860622 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:56.860367 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vbpm4"
Apr 20 14:53:56.860789 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:56.860367 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ht2vg"
Apr 20 14:53:56.860789 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:56.860741 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vbpm4" podUID="aaf83337-5403-4bd0-b782-5d5fa014368f"
Apr 20 14:53:56.860789 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:56.860367 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qmkvh"
Apr 20 14:53:56.860943 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:56.860796 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ht2vg" podUID="2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae"
Apr 20 14:53:56.860943 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:56.860851 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qmkvh" podUID="8b7bbeab-141a-400c-a72b-4990b648aa44"
Apr 20 14:53:58.860856 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:58.860814 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ht2vg"
Apr 20 14:53:58.861334 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:58.860899 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qmkvh"
Apr 20 14:53:58.861334 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:53:58.860935 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vbpm4"
Apr 20 14:53:58.861334 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:58.860954 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ht2vg" podUID="2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae"
Apr 20 14:53:58.861334 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:58.861019 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vbpm4" podUID="aaf83337-5403-4bd0-b782-5d5fa014368f"
Apr 20 14:53:58.861334 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:53:58.861083 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qmkvh" podUID="8b7bbeab-141a-400c-a72b-4990b648aa44"
Apr 20 14:54:00.860791 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:00.860704 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ht2vg"
Apr 20 14:54:00.861285 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:00.860705 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qmkvh"
Apr 20 14:54:00.861285 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:00.860826 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ht2vg" podUID="2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae"
Apr 20 14:54:00.861285 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:00.860704 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vbpm4"
Apr 20 14:54:00.861285 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:00.860884 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qmkvh" podUID="8b7bbeab-141a-400c-a72b-4990b648aa44"
Apr 20 14:54:00.861285 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:00.861012 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vbpm4" podUID="aaf83337-5403-4bd0-b782-5d5fa014368f"
Apr 20 14:54:01.092002 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.091966 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-82.ec2.internal" event="NodeReady"
Apr 20 14:54:01.092167 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.092128 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 20 14:54:01.140534 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.140430 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5d4b464978-2whf9"]
Apr 20 14:54:01.180141 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.180100 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-5flsl"]
Apr 20 14:54:01.180366 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.180347 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:01.183915 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.183888 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 20 14:54:01.184152 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.184138 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rrj9j\""
Apr 20 14:54:01.184759 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.184736 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 20 14:54:01.186371 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.186352 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 20 14:54:01.192150 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.192124 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 20 14:54:01.195910 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.195887 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-t7cf5"]
Apr 20 14:54:01.196047 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.196016 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl"
Apr 20 14:54:01.201063 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.200786 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 20 14:54:01.201063 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.200786 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-75gpj\""
Apr 20 14:54:01.201063 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.201050 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 20 14:54:01.217870 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.217841 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-5flsl"]
Apr 20 14:54:01.218010 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.217873 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d4b464978-2whf9"]
Apr 20 14:54:01.218010 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.217934 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fccb4"]
Apr 20 14:54:01.218102 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.218039 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t7cf5"
Apr 20 14:54:01.221167 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.221145 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 14:54:01.223581 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.223560 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 14:54:01.223718 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.223605 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rc7ph\""
Apr 20 14:54:01.239372 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.239337 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t7cf5"]
Apr 20 14:54:01.239372 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.239375 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fccb4"]
Apr 20 14:54:01.239588 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.239504 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fccb4"
Apr 20 14:54:01.244200 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.244173 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 14:54:01.244376 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.244351 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 14:54:01.244547 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.244399 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 14:54:01.244990 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.244971 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jp5zp\""
Apr 20 14:54:01.319084 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.319049 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/89ddc55e-4262-44ca-b737-16453dbd75de-registry-certificates\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:01.319250 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.319091 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/89ddc55e-4262-44ca-b737-16453dbd75de-installation-pull-secrets\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:01.319250 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.319133 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89ddc55e-4262-44ca-b737-16453dbd75de-trusted-ca\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:01.319250 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.319150 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grv6l\" (UniqueName: \"kubernetes.io/projected/a808e761-5c95-412e-a362-7e3ffb34caeb-kube-api-access-grv6l\") pod \"ingress-canary-fccb4\" (UID: \"a808e761-5c95-412e-a362-7e3ffb34caeb\") " pod="openshift-ingress-canary/ingress-canary-fccb4"
Apr 20 14:54:01.319250 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.319204 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/89ddc55e-4262-44ca-b737-16453dbd75de-image-registry-private-configuration\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:01.319250 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.319244 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3af7863-723b-45a3-8247-7e29b9a9da3c-config-volume\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5"
Apr 20 14:54:01.319485 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.319288 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6cda8435-e869-40a8-9726-f7b6d4767009-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-5flsl\" (UID: \"6cda8435-e869-40a8-9726-f7b6d4767009\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl"
Apr 20 14:54:01.319485 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.319315 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert\") pod \"ingress-canary-fccb4\" (UID: \"a808e761-5c95-412e-a362-7e3ffb34caeb\") " pod="openshift-ingress-canary/ingress-canary-fccb4"
Apr 20 14:54:01.319485 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.319349 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5"
Apr 20 14:54:01.319485 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.319414 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5flsl\" (UID: \"6cda8435-e869-40a8-9726-f7b6d4767009\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl"
Apr 20 14:54:01.319485 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.319448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/89ddc55e-4262-44ca-b737-16453dbd75de-ca-trust-extracted\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:01.319485 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.319476 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmgdl\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-kube-api-access-nmgdl\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:01.319760 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.319533 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rlrz\" (UniqueName: \"kubernetes.io/projected/b3af7863-723b-45a3-8247-7e29b9a9da3c-kube-api-access-7rlrz\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5"
Apr 20 14:54:01.319760 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.319592 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b3af7863-723b-45a3-8247-7e29b9a9da3c-tmp-dir\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5"
Apr 20 14:54:01.319760 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.319633 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:01.319760 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.319662 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-bound-sa-token\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:01.420926 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.420823 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/89ddc55e-4262-44ca-b737-16453dbd75de-registry-certificates\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:01.420926 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.420871 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/89ddc55e-4262-44ca-b737-16453dbd75de-installation-pull-secrets\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:01.420926 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.420900 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89ddc55e-4262-44ca-b737-16453dbd75de-trusted-ca\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:01.420926 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.420917 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grv6l\" (UniqueName: \"kubernetes.io/projected/a808e761-5c95-412e-a362-7e3ffb34caeb-kube-api-access-grv6l\") pod \"ingress-canary-fccb4\" (UID: \"a808e761-5c95-412e-a362-7e3ffb34caeb\") " pod="openshift-ingress-canary/ingress-canary-fccb4"
Apr 20 14:54:01.421244 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.420941 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/89ddc55e-4262-44ca-b737-16453dbd75de-image-registry-private-configuration\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:01.421244 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.420958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3af7863-723b-45a3-8247-7e29b9a9da3c-config-volume\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5"
Apr 20 14:54:01.421244 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.420977 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6cda8435-e869-40a8-9726-f7b6d4767009-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-5flsl\" (UID: \"6cda8435-e869-40a8-9726-f7b6d4767009\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl"
Apr 20 14:54:01.421244 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.420994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert\") pod \"ingress-canary-fccb4\" (UID: \"a808e761-5c95-412e-a362-7e3ffb34caeb\") " pod="openshift-ingress-canary/ingress-canary-fccb4"
Apr 20 14:54:01.421244 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.421020 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5"
Apr 20 14:54:01.421467 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.421269 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5flsl\" (UID: \"6cda8435-e869-40a8-9726-f7b6d4767009\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl"
Apr 20 14:54:01.421467 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.421297 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/89ddc55e-4262-44ca-b737-16453dbd75de-ca-trust-extracted\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:01.421467 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.421315 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmgdl\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-kube-api-access-nmgdl\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:01.421467 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.421342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rlrz\" (UniqueName: \"kubernetes.io/projected/b3af7863-723b-45a3-8247-7e29b9a9da3c-kube-api-access-7rlrz\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5"
Apr 20 14:54:01.421467 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.421370 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b3af7863-723b-45a3-8247-7e29b9a9da3c-tmp-dir\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5"
Apr 20 14:54:01.421467 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.421396 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:01.421467 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.421432 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-bound-sa-token\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:01.421811 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.421491 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3af7863-723b-45a3-8247-7e29b9a9da3c-config-volume\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5"
Apr 20 14:54:01.421811 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.421497 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/89ddc55e-4262-44ca-b737-16453dbd75de-registry-certificates\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:01.421811 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:01.421706 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 14:54:01.421811 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:01.421722 2575 projected.go:194] Error preparing data for
projected volume registry-tls for pod openshift-image-registry/image-registry-5d4b464978-2whf9: secret "image-registry-tls" not found Apr 20 14:54:01.421811 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:01.421771 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls podName:89ddc55e-4262-44ca-b737-16453dbd75de nodeName:}" failed. No retries permitted until 2026-04-20 14:54:01.92175506 +0000 UTC m=+33.653073338 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls") pod "image-registry-5d4b464978-2whf9" (UID: "89ddc55e-4262-44ca-b737-16453dbd75de") : secret "image-registry-tls" not found Apr 20 14:54:01.421811 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.421781 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b3af7863-723b-45a3-8247-7e29b9a9da3c-tmp-dir\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5" Apr 20 14:54:01.421811 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:01.421801 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 14:54:01.421811 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.421785 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89ddc55e-4262-44ca-b737-16453dbd75de-trusted-ca\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9" Apr 20 14:54:01.422148 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:01.421846 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 20 14:54:01.422148 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:01.421865 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert podName:6cda8435-e869-40a8-9726-f7b6d4767009 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:01.921848146 +0000 UTC m=+33.653166422 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-5flsl" (UID: "6cda8435-e869-40a8-9726-f7b6d4767009") : secret "networking-console-plugin-cert" not found Apr 20 14:54:01.422148 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:01.421890 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls podName:b3af7863-723b-45a3-8247-7e29b9a9da3c nodeName:}" failed. No retries permitted until 2026-04-20 14:54:01.921879188 +0000 UTC m=+33.653197460 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls") pod "dns-default-t7cf5" (UID: "b3af7863-723b-45a3-8247-7e29b9a9da3c") : secret "dns-default-metrics-tls" not found Apr 20 14:54:01.422148 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:01.422036 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:54:01.422148 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.422035 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/89ddc55e-4262-44ca-b737-16453dbd75de-ca-trust-extracted\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9" Apr 20 14:54:01.422148 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:01.422088 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert podName:a808e761-5c95-412e-a362-7e3ffb34caeb nodeName:}" failed. No retries permitted until 2026-04-20 14:54:01.922073988 +0000 UTC m=+33.653392261 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert") pod "ingress-canary-fccb4" (UID: "a808e761-5c95-412e-a362-7e3ffb34caeb") : secret "canary-serving-cert" not found Apr 20 14:54:01.425286 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.425257 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/89ddc55e-4262-44ca-b737-16453dbd75de-image-registry-private-configuration\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9" Apr 20 14:54:01.425286 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.425272 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/89ddc55e-4262-44ca-b737-16453dbd75de-installation-pull-secrets\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9" Apr 20 14:54:01.430587 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.430565 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6cda8435-e869-40a8-9726-f7b6d4767009-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-5flsl\" (UID: \"6cda8435-e869-40a8-9726-f7b6d4767009\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl" Apr 20 14:54:01.436482 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.436462 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grv6l\" (UniqueName: \"kubernetes.io/projected/a808e761-5c95-412e-a362-7e3ffb34caeb-kube-api-access-grv6l\") pod \"ingress-canary-fccb4\" (UID: \"a808e761-5c95-412e-a362-7e3ffb34caeb\") " pod="openshift-ingress-canary/ingress-canary-fccb4" Apr 20 
14:54:01.439348 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.439321 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmgdl\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-kube-api-access-nmgdl\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9" Apr 20 14:54:01.439739 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.439714 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rlrz\" (UniqueName: \"kubernetes.io/projected/b3af7863-723b-45a3-8247-7e29b9a9da3c-kube-api-access-7rlrz\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5" Apr 20 14:54:01.440202 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.440185 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-bound-sa-token\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9" Apr 20 14:54:01.925591 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.925500 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9" Apr 20 14:54:01.925591 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.925589 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert\") pod \"ingress-canary-fccb4\" (UID: \"a808e761-5c95-412e-a362-7e3ffb34caeb\") " 
pod="openshift-ingress-canary/ingress-canary-fccb4" Apr 20 14:54:01.926177 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.925607 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5" Apr 20 14:54:01.926177 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.925635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5flsl\" (UID: \"6cda8435-e869-40a8-9726-f7b6d4767009\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl" Apr 20 14:54:01.926177 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:01.925647 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:54:01.926177 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:01.925666 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d4b464978-2whf9: secret "image-registry-tls" not found Apr 20 14:54:01.926177 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:01.925719 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls podName:89ddc55e-4262-44ca-b737-16453dbd75de nodeName:}" failed. No retries permitted until 2026-04-20 14:54:02.92570154 +0000 UTC m=+34.657019810 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls") pod "image-registry-5d4b464978-2whf9" (UID: "89ddc55e-4262-44ca-b737-16453dbd75de") : secret "image-registry-tls" not found Apr 20 14:54:01.926177 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:01.925721 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 14:54:01.926177 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:01.925731 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:54:01.926177 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:01.925769 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert podName:6cda8435-e869-40a8-9726-f7b6d4767009 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:02.92575451 +0000 UTC m=+34.657072781 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-5flsl" (UID: "6cda8435-e869-40a8-9726-f7b6d4767009") : secret "networking-console-plugin-cert" not found Apr 20 14:54:01.926177 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:01.925785 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls podName:b3af7863-723b-45a3-8247-7e29b9a9da3c nodeName:}" failed. No retries permitted until 2026-04-20 14:54:02.925777898 +0000 UTC m=+34.657096163 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls") pod "dns-default-t7cf5" (UID: "b3af7863-723b-45a3-8247-7e29b9a9da3c") : secret "dns-default-metrics-tls" not found Apr 20 14:54:01.926177 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:01.925797 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:54:01.926177 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:01.925850 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert podName:a808e761-5c95-412e-a362-7e3ffb34caeb nodeName:}" failed. No retries permitted until 2026-04-20 14:54:02.925832081 +0000 UTC m=+34.657150360 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert") pod "ingress-canary-fccb4" (UID: "a808e761-5c95-412e-a362-7e3ffb34caeb") : secret "canary-serving-cert" not found Apr 20 14:54:01.991737 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:01.991703 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wzx48" event={"ID":"b8808028-95d4-494d-8038-d6152f52c0e3","Type":"ContainerStarted","Data":"03819fc0bd62c5c82bc539828872509c25f2789dca07ade5a3f3b0a82252dd7a"} Apr 20 14:54:02.528661 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:02.528618 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs\") pod \"network-metrics-daemon-vbpm4\" (UID: \"aaf83337-5403-4bd0-b782-5d5fa014368f\") " pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:54:02.528843 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:02.528734 2575 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:54:02.528843 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:02.528783 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs podName:aaf83337-5403-4bd0-b782-5d5fa014368f nodeName:}" failed. No retries permitted until 2026-04-20 14:54:34.528771186 +0000 UTC m=+66.260089451 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs") pod "network-metrics-daemon-vbpm4" (UID: "aaf83337-5403-4bd0-b782-5d5fa014368f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:54:02.629990 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:02.629947 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kx6n6\" (UniqueName: \"kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6\") pod \"network-check-target-qmkvh\" (UID: \"8b7bbeab-141a-400c-a72b-4990b648aa44\") " pod="openshift-network-diagnostics/network-check-target-qmkvh" Apr 20 14:54:02.630140 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:02.630109 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:54:02.630140 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:02.630126 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:54:02.630140 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:02.630135 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kx6n6 for pod openshift-network-diagnostics/network-check-target-qmkvh: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:54:02.630234 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:02.630187 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6 podName:8b7bbeab-141a-400c-a72b-4990b648aa44 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:34.630171811 +0000 UTC m=+66.361490089 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-kx6n6" (UniqueName: "kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6") pod "network-check-target-qmkvh" (UID: "8b7bbeab-141a-400c-a72b-4990b648aa44") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:54:02.831665 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:02.831627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret\") pod \"global-pull-secret-syncer-ht2vg\" (UID: \"2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae\") " pod="kube-system/global-pull-secret-syncer-ht2vg" Apr 20 14:54:02.831947 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:02.831800 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:54:02.831947 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:02.831884 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret podName:2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae nodeName:}" failed. No retries permitted until 2026-04-20 14:54:34.831862914 +0000 UTC m=+66.563181182 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret") pod "global-pull-secret-syncer-ht2vg" (UID: "2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:54:02.862889 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:02.862857 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qmkvh" Apr 20 14:54:02.863089 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:02.862857 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vbpm4" Apr 20 14:54:02.863089 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:02.862857 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ht2vg" Apr 20 14:54:02.865820 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:02.865800 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 14:54:02.865939 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:02.865867 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 14:54:02.865939 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:02.865924 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 14:54:02.866077 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:02.866062 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-np6cl\"" Apr 20 14:54:02.866617 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:02.866595 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ns2f2\"" Apr 20 14:54:02.866617 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:02.866614 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 14:54:02.932925 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:02.932896 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert\") pod \"ingress-canary-fccb4\" (UID: \"a808e761-5c95-412e-a362-7e3ffb34caeb\") " pod="openshift-ingress-canary/ingress-canary-fccb4" Apr 20 14:54:02.932925 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:02.932931 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5" Apr 20 14:54:02.933465 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:02.932961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5flsl\" (UID: \"6cda8435-e869-40a8-9726-f7b6d4767009\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl" Apr 20 14:54:02.933465 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:02.933035 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9" Apr 20 14:54:02.933465 ip-10-0-129-82 
kubenswrapper[2575]: E0420 14:54:02.933045 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:54:02.933465 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:02.933158 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:54:02.933465 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:02.933171 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d4b464978-2whf9: secret "image-registry-tls" not found Apr 20 14:54:02.933465 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:02.933046 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 14:54:02.933465 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:02.933171 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert podName:a808e761-5c95-412e-a362-7e3ffb34caeb nodeName:}" failed. No retries permitted until 2026-04-20 14:54:04.933150496 +0000 UTC m=+36.664468762 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert") pod "ingress-canary-fccb4" (UID: "a808e761-5c95-412e-a362-7e3ffb34caeb") : secret "canary-serving-cert" not found Apr 20 14:54:02.933465 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:02.933076 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:54:02.933465 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:02.933220 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls podName:89ddc55e-4262-44ca-b737-16453dbd75de nodeName:}" failed. 
No retries permitted until 2026-04-20 14:54:04.933209349 +0000 UTC m=+36.664527618 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls") pod "image-registry-5d4b464978-2whf9" (UID: "89ddc55e-4262-44ca-b737-16453dbd75de") : secret "image-registry-tls" not found Apr 20 14:54:02.933465 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:02.933238 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert podName:6cda8435-e869-40a8-9726-f7b6d4767009 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:04.93322882 +0000 UTC m=+36.664547086 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-5flsl" (UID: "6cda8435-e869-40a8-9726-f7b6d4767009") : secret "networking-console-plugin-cert" not found Apr 20 14:54:02.933465 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:02.933258 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls podName:b3af7863-723b-45a3-8247-7e29b9a9da3c nodeName:}" failed. No retries permitted until 2026-04-20 14:54:04.933249204 +0000 UTC m=+36.664567474 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls") pod "dns-default-t7cf5" (UID: "b3af7863-723b-45a3-8247-7e29b9a9da3c") : secret "dns-default-metrics-tls" not found
Apr 20 14:54:02.996099 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:02.996063 2575 generic.go:358] "Generic (PLEG): container finished" podID="b8808028-95d4-494d-8038-d6152f52c0e3" containerID="03819fc0bd62c5c82bc539828872509c25f2789dca07ade5a3f3b0a82252dd7a" exitCode=0
Apr 20 14:54:02.996275 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:02.996105 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wzx48" event={"ID":"b8808028-95d4-494d-8038-d6152f52c0e3","Type":"ContainerDied","Data":"03819fc0bd62c5c82bc539828872509c25f2789dca07ade5a3f3b0a82252dd7a"}
Apr 20 14:54:04.001473 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:04.001438 2575 generic.go:358] "Generic (PLEG): container finished" podID="b8808028-95d4-494d-8038-d6152f52c0e3" containerID="b219c181445c7fe70de79b7b3377c5351c19e9a0416fc9313e6bb845d4e49b98" exitCode=0
Apr 20 14:54:04.002015 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:04.001483 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wzx48" event={"ID":"b8808028-95d4-494d-8038-d6152f52c0e3","Type":"ContainerDied","Data":"b219c181445c7fe70de79b7b3377c5351c19e9a0416fc9313e6bb845d4e49b98"}
Apr 20 14:54:04.950017 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:04.949788 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert\") pod \"ingress-canary-fccb4\" (UID: \"a808e761-5c95-412e-a362-7e3ffb34caeb\") " pod="openshift-ingress-canary/ingress-canary-fccb4"
Apr 20 14:54:04.950017 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:04.949960 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:54:04.950017 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:04.949973 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5"
Apr 20 14:54:04.950017 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:04.950009 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5flsl\" (UID: \"6cda8435-e869-40a8-9726-f7b6d4767009\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl"
Apr 20 14:54:04.950017 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:04.950022 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert podName:a808e761-5c95-412e-a362-7e3ffb34caeb nodeName:}" failed. No retries permitted until 2026-04-20 14:54:08.95000707 +0000 UTC m=+40.681325335 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert") pod "ingress-canary-fccb4" (UID: "a808e761-5c95-412e-a362-7e3ffb34caeb") : secret "canary-serving-cert" not found
Apr 20 14:54:04.950333 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:04.950074 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 14:54:04.950333 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:04.950079 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:04.950333 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:04.950112 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert podName:6cda8435-e869-40a8-9726-f7b6d4767009 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:08.950101459 +0000 UTC m=+40.681419724 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-5flsl" (UID: "6cda8435-e869-40a8-9726-f7b6d4767009") : secret "networking-console-plugin-cert" not found
Apr 20 14:54:04.950333 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:04.950119 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:54:04.950333 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:04.950141 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 14:54:04.950333 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:04.950149 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d4b464978-2whf9: secret "image-registry-tls" not found
Apr 20 14:54:04.950333 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:04.950168 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls podName:b3af7863-723b-45a3-8247-7e29b9a9da3c nodeName:}" failed. No retries permitted until 2026-04-20 14:54:08.950155312 +0000 UTC m=+40.681473579 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls") pod "dns-default-t7cf5" (UID: "b3af7863-723b-45a3-8247-7e29b9a9da3c") : secret "dns-default-metrics-tls" not found
Apr 20 14:54:04.950333 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:04.950185 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls podName:89ddc55e-4262-44ca-b737-16453dbd75de nodeName:}" failed. No retries permitted until 2026-04-20 14:54:08.950176961 +0000 UTC m=+40.681495228 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls") pod "image-registry-5d4b464978-2whf9" (UID: "89ddc55e-4262-44ca-b737-16453dbd75de") : secret "image-registry-tls" not found
Apr 20 14:54:05.006593 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:05.006563 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wzx48" event={"ID":"b8808028-95d4-494d-8038-d6152f52c0e3","Type":"ContainerStarted","Data":"eacfa3953bf2ae9c22aa55f8d6b0ee8f71903a7d0272cbb1a51c8ff7f38fadb9"}
Apr 20 14:54:05.032638 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:05.032585 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wzx48" podStartSLOduration=5.927987898 podStartE2EDuration="36.032569555s" podCreationTimestamp="2026-04-20 14:53:29 +0000 UTC" firstStartedPulling="2026-04-20 14:53:31.689338274 +0000 UTC m=+3.420656542" lastFinishedPulling="2026-04-20 14:54:01.793919922 +0000 UTC m=+33.525238199" observedRunningTime="2026-04-20 14:54:05.032365274 +0000 UTC m=+36.763683562" watchObservedRunningTime="2026-04-20 14:54:05.032569555 +0000 UTC m=+36.763887841"
Apr 20 14:54:08.981976 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:08.981922 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:08.982438 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:08.982010 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert\") pod \"ingress-canary-fccb4\" (UID: \"a808e761-5c95-412e-a362-7e3ffb34caeb\") " pod="openshift-ingress-canary/ingress-canary-fccb4"
Apr 20 14:54:08.982438 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:08.982029 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5"
Apr 20 14:54:08.982438 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:08.982058 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5flsl\" (UID: \"6cda8435-e869-40a8-9726-f7b6d4767009\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl"
Apr 20 14:54:08.982438 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:08.982063 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 14:54:08.982438 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:08.982082 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d4b464978-2whf9: secret "image-registry-tls" not found
Apr 20 14:54:08.982438 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:08.982128 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls podName:89ddc55e-4262-44ca-b737-16453dbd75de nodeName:}" failed. No retries permitted until 2026-04-20 14:54:16.982114295 +0000 UTC m=+48.713432560 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls") pod "image-registry-5d4b464978-2whf9" (UID: "89ddc55e-4262-44ca-b737-16453dbd75de") : secret "image-registry-tls" not found
Apr 20 14:54:08.982438 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:08.982154 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 14:54:08.982438 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:08.982156 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:54:08.982438 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:08.982169 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:54:08.982438 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:08.982209 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls podName:b3af7863-723b-45a3-8247-7e29b9a9da3c nodeName:}" failed. No retries permitted until 2026-04-20 14:54:16.98219307 +0000 UTC m=+48.713511339 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls") pod "dns-default-t7cf5" (UID: "b3af7863-723b-45a3-8247-7e29b9a9da3c") : secret "dns-default-metrics-tls" not found
Apr 20 14:54:08.982438 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:08.982234 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert podName:a808e761-5c95-412e-a362-7e3ffb34caeb nodeName:}" failed. No retries permitted until 2026-04-20 14:54:16.98221803 +0000 UTC m=+48.713536307 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert") pod "ingress-canary-fccb4" (UID: "a808e761-5c95-412e-a362-7e3ffb34caeb") : secret "canary-serving-cert" not found
Apr 20 14:54:08.982438 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:08.982254 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert podName:6cda8435-e869-40a8-9726-f7b6d4767009 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:16.98224449 +0000 UTC m=+48.713562759 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-5flsl" (UID: "6cda8435-e869-40a8-9726-f7b6d4767009") : secret "networking-console-plugin-cert" not found
Apr 20 14:54:17.043632 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:17.043594 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert\") pod \"ingress-canary-fccb4\" (UID: \"a808e761-5c95-412e-a362-7e3ffb34caeb\") " pod="openshift-ingress-canary/ingress-canary-fccb4"
Apr 20 14:54:17.043632 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:17.043631 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5"
Apr 20 14:54:17.044167 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:17.043661 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5flsl\" (UID: \"6cda8435-e869-40a8-9726-f7b6d4767009\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl"
Apr 20 14:54:17.044167 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:17.043698 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:17.044167 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:17.043752 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:54:17.044167 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:17.043803 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 14:54:17.044167 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:17.043813 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d4b464978-2whf9: secret "image-registry-tls" not found
Apr 20 14:54:17.044167 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:17.043817 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls podName:b3af7863-723b-45a3-8247-7e29b9a9da3c nodeName:}" failed. No retries permitted until 2026-04-20 14:54:33.043801374 +0000 UTC m=+64.775119643 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls") pod "dns-default-t7cf5" (UID: "b3af7863-723b-45a3-8247-7e29b9a9da3c") : secret "dns-default-metrics-tls" not found
Apr 20 14:54:17.044167 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:17.043838 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 14:54:17.044167 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:17.043853 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls podName:89ddc55e-4262-44ca-b737-16453dbd75de nodeName:}" failed. No retries permitted until 2026-04-20 14:54:33.043838029 +0000 UTC m=+64.775156308 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls") pod "image-registry-5d4b464978-2whf9" (UID: "89ddc55e-4262-44ca-b737-16453dbd75de") : secret "image-registry-tls" not found
Apr 20 14:54:17.044167 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:17.043752 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:54:17.044167 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:17.043883 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert podName:a808e761-5c95-412e-a362-7e3ffb34caeb nodeName:}" failed. No retries permitted until 2026-04-20 14:54:33.04387612 +0000 UTC m=+64.775194385 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert") pod "ingress-canary-fccb4" (UID: "a808e761-5c95-412e-a362-7e3ffb34caeb") : secret "canary-serving-cert" not found
Apr 20 14:54:17.044167 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:17.043894 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert podName:6cda8435-e869-40a8-9726-f7b6d4767009 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:33.043888409 +0000 UTC m=+64.775206674 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-5flsl" (UID: "6cda8435-e869-40a8-9726-f7b6d4767009") : secret "networking-console-plugin-cert" not found
Apr 20 14:54:25.986960 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:25.986931 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q8mhl"
Apr 20 14:54:33.066438 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:33.066402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:54:33.066913 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:33.066471 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert\") pod \"ingress-canary-fccb4\" (UID: \"a808e761-5c95-412e-a362-7e3ffb34caeb\") " pod="openshift-ingress-canary/ingress-canary-fccb4"
Apr 20 14:54:33.066913 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:33.066492 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5"
Apr 20 14:54:33.066913 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:33.066539 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5flsl\" (UID: \"6cda8435-e869-40a8-9726-f7b6d4767009\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl"
Apr 20 14:54:33.066913 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:33.066552 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 14:54:33.066913 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:33.066568 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d4b464978-2whf9: secret "image-registry-tls" not found
Apr 20 14:54:33.066913 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:33.066618 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls podName:89ddc55e-4262-44ca-b737-16453dbd75de nodeName:}" failed. No retries permitted until 2026-04-20 14:55:05.066603232 +0000 UTC m=+96.797921497 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls") pod "image-registry-5d4b464978-2whf9" (UID: "89ddc55e-4262-44ca-b737-16453dbd75de") : secret "image-registry-tls" not found
Apr 20 14:54:33.066913 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:33.066623 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:54:33.066913 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:33.066677 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert podName:a808e761-5c95-412e-a362-7e3ffb34caeb nodeName:}" failed. No retries permitted until 2026-04-20 14:55:05.066666405 +0000 UTC m=+96.797984674 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert") pod "ingress-canary-fccb4" (UID: "a808e761-5c95-412e-a362-7e3ffb34caeb") : secret "canary-serving-cert" not found
Apr 20 14:54:33.066913 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:33.066626 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:54:33.066913 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:33.066713 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls podName:b3af7863-723b-45a3-8247-7e29b9a9da3c nodeName:}" failed. No retries permitted until 2026-04-20 14:55:05.066706123 +0000 UTC m=+96.798024388 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls") pod "dns-default-t7cf5" (UID: "b3af7863-723b-45a3-8247-7e29b9a9da3c") : secret "dns-default-metrics-tls" not found
Apr 20 14:54:33.066913 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:33.066630 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 14:54:33.066913 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:33.066737 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert podName:6cda8435-e869-40a8-9726-f7b6d4767009 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:05.066732757 +0000 UTC m=+96.798051022 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-5flsl" (UID: "6cda8435-e869-40a8-9726-f7b6d4767009") : secret "networking-console-plugin-cert" not found
Apr 20 14:54:34.577914 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:34.577878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs\") pod \"network-metrics-daemon-vbpm4\" (UID: \"aaf83337-5403-4bd0-b782-5d5fa014368f\") " pod="openshift-multus/network-metrics-daemon-vbpm4"
Apr 20 14:54:34.580405 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:34.580389 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 14:54:34.588341 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:34.588323 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 14:54:34.588396 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:54:34.588381 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs podName:aaf83337-5403-4bd0-b782-5d5fa014368f nodeName:}" failed. No retries permitted until 2026-04-20 14:55:38.588361805 +0000 UTC m=+130.319680070 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs") pod "network-metrics-daemon-vbpm4" (UID: "aaf83337-5403-4bd0-b782-5d5fa014368f") : secret "metrics-daemon-secret" not found
Apr 20 14:54:34.678473 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:34.678445 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kx6n6\" (UniqueName: \"kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6\") pod \"network-check-target-qmkvh\" (UID: \"8b7bbeab-141a-400c-a72b-4990b648aa44\") " pod="openshift-network-diagnostics/network-check-target-qmkvh"
Apr 20 14:54:34.681211 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:34.681195 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 14:54:34.691066 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:34.691052 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 14:54:34.703232 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:34.703212 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx6n6\" (UniqueName: \"kubernetes.io/projected/8b7bbeab-141a-400c-a72b-4990b648aa44-kube-api-access-kx6n6\") pod \"network-check-target-qmkvh\" (UID: \"8b7bbeab-141a-400c-a72b-4990b648aa44\") " pod="openshift-network-diagnostics/network-check-target-qmkvh"
Apr 20 14:54:34.880555 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:34.880465 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret\") pod \"global-pull-secret-syncer-ht2vg\" (UID: \"2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae\") " pod="kube-system/global-pull-secret-syncer-ht2vg"
Apr 20 14:54:34.882995 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:34.882980 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 14:54:34.892635 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:34.892611 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae-original-pull-secret\") pod \"global-pull-secret-syncer-ht2vg\" (UID: \"2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae\") " pod="kube-system/global-pull-secret-syncer-ht2vg"
Apr 20 14:54:34.974104 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:34.974078 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-np6cl\""
Apr 20 14:54:34.982337 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:34.982318 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qmkvh"
Apr 20 14:54:34.982400 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:34.982323 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ht2vg"
Apr 20 14:54:35.159217 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:35.159149 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ht2vg"]
Apr 20 14:54:35.162105 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:35.162080 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qmkvh"]
Apr 20 14:54:35.167209 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:54:35.167182 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b7bbeab_141a_400c_a72b_4990b648aa44.slice/crio-dcb6b335002a249b6c7f118ab9d38b1ed6955b33c88d45d1b9930b3f3b3f12c4 WatchSource:0}: Error finding container dcb6b335002a249b6c7f118ab9d38b1ed6955b33c88d45d1b9930b3f3b3f12c4: Status 404 returned error can't find the container with id dcb6b335002a249b6c7f118ab9d38b1ed6955b33c88d45d1b9930b3f3b3f12c4
Apr 20 14:54:35.167562 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:54:35.167542 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d3ee4f7_b2d5_4a5c_985e_0cac25e122ae.slice/crio-c5fcf27556cf6561b517ba885363bb091a43d4e1f093d73a6401b38287e1e09c WatchSource:0}: Error finding container c5fcf27556cf6561b517ba885363bb091a43d4e1f093d73a6401b38287e1e09c: Status 404 returned error can't find the container with id c5fcf27556cf6561b517ba885363bb091a43d4e1f093d73a6401b38287e1e09c
Apr 20 14:54:36.065756 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:36.065704 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ht2vg" event={"ID":"2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae","Type":"ContainerStarted","Data":"c5fcf27556cf6561b517ba885363bb091a43d4e1f093d73a6401b38287e1e09c"}
Apr 20 14:54:36.066959 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:36.066915 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qmkvh" event={"ID":"8b7bbeab-141a-400c-a72b-4990b648aa44","Type":"ContainerStarted","Data":"dcb6b335002a249b6c7f118ab9d38b1ed6955b33c88d45d1b9930b3f3b3f12c4"}
Apr 20 14:54:40.076235 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:40.076197 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ht2vg" event={"ID":"2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae","Type":"ContainerStarted","Data":"cf3516529ecd915f6f4b8f06199cf8603511e23fff12bf1a13490a2e5ff47015"}
Apr 20 14:54:40.077397 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:40.077374 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qmkvh" event={"ID":"8b7bbeab-141a-400c-a72b-4990b648aa44","Type":"ContainerStarted","Data":"38c0cfbe7d5e3a415ed1ecb92aa16cc2ac17257a742802a9ae10b190949c5c1c"}
Apr 20 14:54:40.077536 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:40.077524 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qmkvh"
Apr 20 14:54:40.089664 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:40.089623 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-ht2vg" podStartSLOduration=65.88564197 podStartE2EDuration="1m10.089610431s" podCreationTimestamp="2026-04-20 14:53:30 +0000 UTC" firstStartedPulling="2026-04-20 14:54:35.168988764 +0000 UTC m=+66.900307030" lastFinishedPulling="2026-04-20 14:54:39.372957217 +0000 UTC m=+71.104275491" observedRunningTime="2026-04-20 14:54:40.089294501 +0000 UTC m=+71.820612814" watchObservedRunningTime="2026-04-20 14:54:40.089610431 +0000 UTC m=+71.820928716"
Apr 20 14:54:40.101980 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:54:40.101933 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qmkvh" podStartSLOduration=66.902046769 podStartE2EDuration="1m11.101918474s" podCreationTimestamp="2026-04-20 14:53:29 +0000 UTC" firstStartedPulling="2026-04-20 14:54:35.168961087 +0000 UTC m=+66.900279352" lastFinishedPulling="2026-04-20 14:54:39.368832783 +0000 UTC m=+71.100151057" observedRunningTime="2026-04-20 14:54:40.101643719 +0000 UTC m=+71.832962008" watchObservedRunningTime="2026-04-20 14:54:40.101918474 +0000 UTC m=+71.833236760"
Apr 20 14:55:05.111939 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:05.111895 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert\") pod \"ingress-canary-fccb4\" (UID: \"a808e761-5c95-412e-a362-7e3ffb34caeb\") " pod="openshift-ingress-canary/ingress-canary-fccb4"
Apr 20 14:55:05.112361 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:05.111938 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5"
Apr 20 14:55:05.112361 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:05.112014 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5flsl\" (UID: \"6cda8435-e869-40a8-9726-f7b6d4767009\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl"
Apr 20 14:55:05.112361 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:55:05.112069 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:55:05.112361 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:55:05.112110 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:55:05.112361 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:55:05.112141 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 14:55:05.112361 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:05.112070 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:55:05.112361 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:55:05.112155 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d4b464978-2whf9: secret "image-registry-tls" not found
Apr 20 14:55:05.112361 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:55:05.112156 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert podName:a808e761-5c95-412e-a362-7e3ffb34caeb nodeName:}" failed. No retries permitted until 2026-04-20 14:56:09.112134374 +0000 UTC m=+160.843452657 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert") pod "ingress-canary-fccb4" (UID: "a808e761-5c95-412e-a362-7e3ffb34caeb") : secret "canary-serving-cert" not found
Apr 20 14:55:05.112361 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:55:05.112186 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 14:55:05.112361 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:55:05.112203 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls podName:b3af7863-723b-45a3-8247-7e29b9a9da3c nodeName:}" failed. No retries permitted until 2026-04-20 14:56:09.112189097 +0000 UTC m=+160.843507362 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls") pod "dns-default-t7cf5" (UID: "b3af7863-723b-45a3-8247-7e29b9a9da3c") : secret "dns-default-metrics-tls" not found
Apr 20 14:55:05.112361 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:55:05.112223 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls podName:89ddc55e-4262-44ca-b737-16453dbd75de nodeName:}" failed. No retries permitted until 2026-04-20 14:56:09.112216979 +0000 UTC m=+160.843535244 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls") pod "image-registry-5d4b464978-2whf9" (UID: "89ddc55e-4262-44ca-b737-16453dbd75de") : secret "image-registry-tls" not found
Apr 20 14:55:05.112361 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:55:05.112260 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert podName:6cda8435-e869-40a8-9726-f7b6d4767009 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:09.112254628 +0000 UTC m=+160.843572894 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-5flsl" (UID: "6cda8435-e869-40a8-9726-f7b6d4767009") : secret "networking-console-plugin-cert" not found
Apr 20 14:55:11.082704 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:11.082676 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qmkvh"
Apr 20 14:55:38.660752 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:38.660705 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs\") pod \"network-metrics-daemon-vbpm4\" (UID: \"aaf83337-5403-4bd0-b782-5d5fa014368f\") " pod="openshift-multus/network-metrics-daemon-vbpm4"
Apr 20 14:55:38.661183 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:55:38.660873 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 14:55:38.661183 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:55:38.660941 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs podName:aaf83337-5403-4bd0-b782-5d5fa014368f nodeName:}" failed. No retries permitted until 2026-04-20 14:57:40.660922376 +0000 UTC m=+252.392240655 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs") pod "network-metrics-daemon-vbpm4" (UID: "aaf83337-5403-4bd0-b782-5d5fa014368f") : secret "metrics-daemon-secret" not found Apr 20 14:55:54.076284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:54.076248 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rgm6x"] Apr 20 14:55:54.079234 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:54.079214 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rgm6x" Apr 20 14:55:54.081489 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:54.081464 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 20 14:55:54.081628 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:54.081526 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:55:54.081628 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:54.081526 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-fnbxh\"" Apr 20 14:55:54.082477 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:54.082464 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 20 14:55:54.086679 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:54.086631 2575 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rgm6x"] Apr 20 14:55:54.171777 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:54.171734 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0709e543-4782-4ec6-a5b6-bf69ac9c6834-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rgm6x\" (UID: \"0709e543-4782-4ec6-a5b6-bf69ac9c6834\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rgm6x" Apr 20 14:55:54.171952 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:54.171790 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf924\" (UniqueName: \"kubernetes.io/projected/0709e543-4782-4ec6-a5b6-bf69ac9c6834-kube-api-access-nf924\") pod \"cluster-samples-operator-6dc5bdb6b4-rgm6x\" (UID: \"0709e543-4782-4ec6-a5b6-bf69ac9c6834\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rgm6x" Apr 20 14:55:54.272600 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:54.272564 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0709e543-4782-4ec6-a5b6-bf69ac9c6834-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rgm6x\" (UID: \"0709e543-4782-4ec6-a5b6-bf69ac9c6834\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rgm6x" Apr 20 14:55:54.272600 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:54.272610 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nf924\" (UniqueName: \"kubernetes.io/projected/0709e543-4782-4ec6-a5b6-bf69ac9c6834-kube-api-access-nf924\") pod \"cluster-samples-operator-6dc5bdb6b4-rgm6x\" (UID: \"0709e543-4782-4ec6-a5b6-bf69ac9c6834\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rgm6x" Apr 20 14:55:54.272818 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:55:54.272728 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 14:55:54.272818 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:55:54.272792 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0709e543-4782-4ec6-a5b6-bf69ac9c6834-samples-operator-tls podName:0709e543-4782-4ec6-a5b6-bf69ac9c6834 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:54.772775304 +0000 UTC m=+146.504093580 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0709e543-4782-4ec6-a5b6-bf69ac9c6834-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rgm6x" (UID: "0709e543-4782-4ec6-a5b6-bf69ac9c6834") : secret "samples-operator-tls" not found Apr 20 14:55:54.283440 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:54.283394 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf924\" (UniqueName: \"kubernetes.io/projected/0709e543-4782-4ec6-a5b6-bf69ac9c6834-kube-api-access-nf924\") pod \"cluster-samples-operator-6dc5bdb6b4-rgm6x\" (UID: \"0709e543-4782-4ec6-a5b6-bf69ac9c6834\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rgm6x" Apr 20 14:55:54.776736 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:54.776696 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0709e543-4782-4ec6-a5b6-bf69ac9c6834-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rgm6x\" (UID: \"0709e543-4782-4ec6-a5b6-bf69ac9c6834\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rgm6x" Apr 20 14:55:54.776945 ip-10-0-129-82 kubenswrapper[2575]: E0420 
14:55:54.776871 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 14:55:54.777009 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:55:54.776955 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0709e543-4782-4ec6-a5b6-bf69ac9c6834-samples-operator-tls podName:0709e543-4782-4ec6-a5b6-bf69ac9c6834 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:55.776930817 +0000 UTC m=+147.508249082 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0709e543-4782-4ec6-a5b6-bf69ac9c6834-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rgm6x" (UID: "0709e543-4782-4ec6-a5b6-bf69ac9c6834") : secret "samples-operator-tls" not found Apr 20 14:55:55.294211 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:55.294178 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hscb7_d8c441a8-d683-4309-8a59-c1525285f7e1/dns-node-resolver/0.log" Apr 20 14:55:55.694154 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:55.694075 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9vx92_007a8d25-f684-41a4-a2f6-d4e2f7bd79d5/node-ca/0.log" Apr 20 14:55:55.783562 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:55.783499 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0709e543-4782-4ec6-a5b6-bf69ac9c6834-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rgm6x\" (UID: \"0709e543-4782-4ec6-a5b6-bf69ac9c6834\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rgm6x" Apr 20 14:55:55.783725 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:55:55.783644 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret 
"samples-operator-tls" not found Apr 20 14:55:55.783725 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:55:55.783719 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0709e543-4782-4ec6-a5b6-bf69ac9c6834-samples-operator-tls podName:0709e543-4782-4ec6-a5b6-bf69ac9c6834 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:57.783702874 +0000 UTC m=+149.515021139 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0709e543-4782-4ec6-a5b6-bf69ac9c6834-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rgm6x" (UID: "0709e543-4782-4ec6-a5b6-bf69ac9c6834") : secret "samples-operator-tls" not found Apr 20 14:55:55.978708 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:55.978678 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-b8vvq"] Apr 20 14:55:55.981622 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:55.981602 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" Apr 20 14:55:55.983831 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:55.983805 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 20 14:55:55.984004 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:55.983838 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:55:55.984813 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:55.984787 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-slr45\"" Apr 20 14:55:55.985040 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:55.984794 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 20 14:55:55.985182 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:55.984851 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 20 14:55:55.988842 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:55.988770 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 20 14:55:55.991152 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:55.991127 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-b8vvq"] Apr 20 14:55:56.086106 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:56.086066 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3b8c0ca-6a14-4aa3-b779-8722694554e7-config\") pod \"console-operator-9d4b6777b-b8vvq\" (UID: \"a3b8c0ca-6a14-4aa3-b779-8722694554e7\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" Apr 20 14:55:56.086106 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:56.086114 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3b8c0ca-6a14-4aa3-b779-8722694554e7-serving-cert\") pod \"console-operator-9d4b6777b-b8vvq\" (UID: \"a3b8c0ca-6a14-4aa3-b779-8722694554e7\") " pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" Apr 20 14:55:56.086317 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:56.086159 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3b8c0ca-6a14-4aa3-b779-8722694554e7-trusted-ca\") pod \"console-operator-9d4b6777b-b8vvq\" (UID: \"a3b8c0ca-6a14-4aa3-b779-8722694554e7\") " pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" Apr 20 14:55:56.086317 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:56.086231 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxn8g\" (UniqueName: \"kubernetes.io/projected/a3b8c0ca-6a14-4aa3-b779-8722694554e7-kube-api-access-mxn8g\") pod \"console-operator-9d4b6777b-b8vvq\" (UID: \"a3b8c0ca-6a14-4aa3-b779-8722694554e7\") " pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" Apr 20 14:55:56.187500 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:56.187461 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3b8c0ca-6a14-4aa3-b779-8722694554e7-serving-cert\") pod \"console-operator-9d4b6777b-b8vvq\" (UID: \"a3b8c0ca-6a14-4aa3-b779-8722694554e7\") " pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" Apr 20 14:55:56.187500 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:56.187503 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3b8c0ca-6a14-4aa3-b779-8722694554e7-trusted-ca\") pod \"console-operator-9d4b6777b-b8vvq\" (UID: \"a3b8c0ca-6a14-4aa3-b779-8722694554e7\") " pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" Apr 20 14:55:56.187743 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:56.187584 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxn8g\" (UniqueName: \"kubernetes.io/projected/a3b8c0ca-6a14-4aa3-b779-8722694554e7-kube-api-access-mxn8g\") pod \"console-operator-9d4b6777b-b8vvq\" (UID: \"a3b8c0ca-6a14-4aa3-b779-8722694554e7\") " pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" Apr 20 14:55:56.187743 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:56.187672 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3b8c0ca-6a14-4aa3-b779-8722694554e7-config\") pod \"console-operator-9d4b6777b-b8vvq\" (UID: \"a3b8c0ca-6a14-4aa3-b779-8722694554e7\") " pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" Apr 20 14:55:56.188165 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:56.188144 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3b8c0ca-6a14-4aa3-b779-8722694554e7-config\") pod \"console-operator-9d4b6777b-b8vvq\" (UID: \"a3b8c0ca-6a14-4aa3-b779-8722694554e7\") " pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" Apr 20 14:55:56.188351 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:56.188330 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3b8c0ca-6a14-4aa3-b779-8722694554e7-trusted-ca\") pod \"console-operator-9d4b6777b-b8vvq\" (UID: \"a3b8c0ca-6a14-4aa3-b779-8722694554e7\") " pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" Apr 20 14:55:56.189670 ip-10-0-129-82 
kubenswrapper[2575]: I0420 14:55:56.189650 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3b8c0ca-6a14-4aa3-b779-8722694554e7-serving-cert\") pod \"console-operator-9d4b6777b-b8vvq\" (UID: \"a3b8c0ca-6a14-4aa3-b779-8722694554e7\") " pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" Apr 20 14:55:56.195177 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:56.195150 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxn8g\" (UniqueName: \"kubernetes.io/projected/a3b8c0ca-6a14-4aa3-b779-8722694554e7-kube-api-access-mxn8g\") pod \"console-operator-9d4b6777b-b8vvq\" (UID: \"a3b8c0ca-6a14-4aa3-b779-8722694554e7\") " pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" Apr 20 14:55:56.292077 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:56.291972 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" Apr 20 14:55:56.410158 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:56.410123 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-b8vvq"] Apr 20 14:55:56.415041 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:55:56.415013 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3b8c0ca_6a14_4aa3_b779_8722694554e7.slice/crio-ba687d3dbfbf14cdb5f0fa36ec2dfa76322dafd0831850fb96fa51ec033c599b WatchSource:0}: Error finding container ba687d3dbfbf14cdb5f0fa36ec2dfa76322dafd0831850fb96fa51ec033c599b: Status 404 returned error can't find the container with id ba687d3dbfbf14cdb5f0fa36ec2dfa76322dafd0831850fb96fa51ec033c599b Apr 20 14:55:57.229233 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:57.229197 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" event={"ID":"a3b8c0ca-6a14-4aa3-b779-8722694554e7","Type":"ContainerStarted","Data":"ba687d3dbfbf14cdb5f0fa36ec2dfa76322dafd0831850fb96fa51ec033c599b"} Apr 20 14:55:57.801495 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:57.801441 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0709e543-4782-4ec6-a5b6-bf69ac9c6834-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rgm6x\" (UID: \"0709e543-4782-4ec6-a5b6-bf69ac9c6834\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rgm6x" Apr 20 14:55:57.801903 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:55:57.801611 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 14:55:57.801903 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:55:57.801677 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0709e543-4782-4ec6-a5b6-bf69ac9c6834-samples-operator-tls podName:0709e543-4782-4ec6-a5b6-bf69ac9c6834 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:01.801660287 +0000 UTC m=+153.532978569 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0709e543-4782-4ec6-a5b6-bf69ac9c6834-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rgm6x" (UID: "0709e543-4782-4ec6-a5b6-bf69ac9c6834") : secret "samples-operator-tls" not found Apr 20 14:55:58.980900 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:58.980867 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8l5tw"] Apr 20 14:55:58.983830 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:58.983807 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8l5tw" Apr 20 14:55:58.989545 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:58.987014 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-spb42\"" Apr 20 14:55:58.989545 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:58.987183 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 20 14:55:58.989545 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:58.987014 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 20 14:55:58.989545 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:58.987526 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:55:58.989545 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:58.987541 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 20 14:55:58.992366 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:58.992337 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8l5tw"] Apr 20 14:55:59.079868 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.079786 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9qx5x"] Apr 20 14:55:59.082784 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.082763 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9qx5x" Apr 20 14:55:59.085197 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.085171 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 20 14:55:59.085197 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.085191 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 20 14:55:59.085378 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.085170 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:55:59.085378 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.085210 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 20 14:55:59.085378 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.085246 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-5qxhg\"" Apr 20 14:55:59.090636 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.090602 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9qx5x"] Apr 20 14:55:59.113149 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.113094 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fce9aead-ae79-449d-9e77-55a7a14471b5-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-8l5tw\" (UID: \"fce9aead-ae79-449d-9e77-55a7a14471b5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8l5tw" Apr 20 14:55:59.113353 ip-10-0-129-82 kubenswrapper[2575]: I0420 
14:55:59.113222 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fce9aead-ae79-449d-9e77-55a7a14471b5-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-8l5tw\" (UID: \"fce9aead-ae79-449d-9e77-55a7a14471b5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8l5tw" Apr 20 14:55:59.113353 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.113266 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp9l7\" (UniqueName: \"kubernetes.io/projected/fce9aead-ae79-449d-9e77-55a7a14471b5-kube-api-access-wp9l7\") pod \"kube-storage-version-migrator-operator-6769c5d45-8l5tw\" (UID: \"fce9aead-ae79-449d-9e77-55a7a14471b5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8l5tw" Apr 20 14:55:59.214033 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.213988 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wp9l7\" (UniqueName: \"kubernetes.io/projected/fce9aead-ae79-449d-9e77-55a7a14471b5-kube-api-access-wp9l7\") pod \"kube-storage-version-migrator-operator-6769c5d45-8l5tw\" (UID: \"fce9aead-ae79-449d-9e77-55a7a14471b5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8l5tw" Apr 20 14:55:59.214168 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.214050 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fce9aead-ae79-449d-9e77-55a7a14471b5-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-8l5tw\" (UID: \"fce9aead-ae79-449d-9e77-55a7a14471b5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8l5tw" Apr 20 14:55:59.214168 
ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.214072 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5a93abc-e707-4adb-9942-2ed22b758d32-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9qx5x\" (UID: \"f5a93abc-e707-4adb-9942-2ed22b758d32\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9qx5x"
Apr 20 14:55:59.214413 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.214191 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a93abc-e707-4adb-9942-2ed22b758d32-config\") pod \"service-ca-operator-d6fc45fc5-9qx5x\" (UID: \"f5a93abc-e707-4adb-9942-2ed22b758d32\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9qx5x"
Apr 20 14:55:59.214413 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.214218 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrl79\" (UniqueName: \"kubernetes.io/projected/f5a93abc-e707-4adb-9942-2ed22b758d32-kube-api-access-xrl79\") pod \"service-ca-operator-d6fc45fc5-9qx5x\" (UID: \"f5a93abc-e707-4adb-9942-2ed22b758d32\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9qx5x"
Apr 20 14:55:59.214413 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.214279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fce9aead-ae79-449d-9e77-55a7a14471b5-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-8l5tw\" (UID: \"fce9aead-ae79-449d-9e77-55a7a14471b5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8l5tw"
Apr 20 14:55:59.214586 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.214568 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fce9aead-ae79-449d-9e77-55a7a14471b5-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-8l5tw\" (UID: \"fce9aead-ae79-449d-9e77-55a7a14471b5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8l5tw"
Apr 20 14:55:59.216420 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.216393 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fce9aead-ae79-449d-9e77-55a7a14471b5-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-8l5tw\" (UID: \"fce9aead-ae79-449d-9e77-55a7a14471b5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8l5tw"
Apr 20 14:55:59.222074 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.222033 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp9l7\" (UniqueName: \"kubernetes.io/projected/fce9aead-ae79-449d-9e77-55a7a14471b5-kube-api-access-wp9l7\") pod \"kube-storage-version-migrator-operator-6769c5d45-8l5tw\" (UID: \"fce9aead-ae79-449d-9e77-55a7a14471b5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8l5tw"
Apr 20 14:55:59.233808 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.233782 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/0.log"
Apr 20 14:55:59.233992 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.233819 2575 generic.go:358] "Generic (PLEG): container finished" podID="a3b8c0ca-6a14-4aa3-b779-8722694554e7" containerID="e282fe5253637c47d2512261afaebf2a61338ce336b73930f9974b78d1e12e79" exitCode=255
Apr 20 14:55:59.233992 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.233851 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" event={"ID":"a3b8c0ca-6a14-4aa3-b779-8722694554e7","Type":"ContainerDied","Data":"e282fe5253637c47d2512261afaebf2a61338ce336b73930f9974b78d1e12e79"}
Apr 20 14:55:59.234150 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.234134 2575 scope.go:117] "RemoveContainer" containerID="e282fe5253637c47d2512261afaebf2a61338ce336b73930f9974b78d1e12e79"
Apr 20 14:55:59.297766 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.297737 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8l5tw"
Apr 20 14:55:59.314697 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.314657 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5a93abc-e707-4adb-9942-2ed22b758d32-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9qx5x\" (UID: \"f5a93abc-e707-4adb-9942-2ed22b758d32\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9qx5x"
Apr 20 14:55:59.314897 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.314713 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a93abc-e707-4adb-9942-2ed22b758d32-config\") pod \"service-ca-operator-d6fc45fc5-9qx5x\" (UID: \"f5a93abc-e707-4adb-9942-2ed22b758d32\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9qx5x"
Apr 20 14:55:59.314897 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.314755 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrl79\" (UniqueName: \"kubernetes.io/projected/f5a93abc-e707-4adb-9942-2ed22b758d32-kube-api-access-xrl79\") pod \"service-ca-operator-d6fc45fc5-9qx5x\" (UID: \"f5a93abc-e707-4adb-9942-2ed22b758d32\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9qx5x"
Apr 20 14:55:59.315370 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.315335 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a93abc-e707-4adb-9942-2ed22b758d32-config\") pod \"service-ca-operator-d6fc45fc5-9qx5x\" (UID: \"f5a93abc-e707-4adb-9942-2ed22b758d32\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9qx5x"
Apr 20 14:55:59.317387 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.317357 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5a93abc-e707-4adb-9942-2ed22b758d32-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9qx5x\" (UID: \"f5a93abc-e707-4adb-9942-2ed22b758d32\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9qx5x"
Apr 20 14:55:59.322322 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.322296 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrl79\" (UniqueName: \"kubernetes.io/projected/f5a93abc-e707-4adb-9942-2ed22b758d32-kube-api-access-xrl79\") pod \"service-ca-operator-d6fc45fc5-9qx5x\" (UID: \"f5a93abc-e707-4adb-9942-2ed22b758d32\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9qx5x"
Apr 20 14:55:59.392031 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.392001 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9qx5x"
Apr 20 14:55:59.417055 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.417019 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8l5tw"]
Apr 20 14:55:59.421608 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:55:59.421578 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfce9aead_ae79_449d_9e77_55a7a14471b5.slice/crio-610db5cdee2d37aa1116d0afdae1f94c324e854fd6c5b1084d76906eca30c051 WatchSource:0}: Error finding container 610db5cdee2d37aa1116d0afdae1f94c324e854fd6c5b1084d76906eca30c051: Status 404 returned error can't find the container with id 610db5cdee2d37aa1116d0afdae1f94c324e854fd6c5b1084d76906eca30c051
Apr 20 14:55:59.509291 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:55:59.509252 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9qx5x"]
Apr 20 14:55:59.513924 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:55:59.513893 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5a93abc_e707_4adb_9942_2ed22b758d32.slice/crio-3b84e7bdcff937a715cb3684e016889af7a6e4d405f84841798eedddbb097825 WatchSource:0}: Error finding container 3b84e7bdcff937a715cb3684e016889af7a6e4d405f84841798eedddbb097825: Status 404 returned error can't find the container with id 3b84e7bdcff937a715cb3684e016889af7a6e4d405f84841798eedddbb097825
Apr 20 14:56:00.237465 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:00.237425 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8l5tw" event={"ID":"fce9aead-ae79-449d-9e77-55a7a14471b5","Type":"ContainerStarted","Data":"610db5cdee2d37aa1116d0afdae1f94c324e854fd6c5b1084d76906eca30c051"}
Apr 20 14:56:00.238725 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:00.238683 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9qx5x" event={"ID":"f5a93abc-e707-4adb-9942-2ed22b758d32","Type":"ContainerStarted","Data":"3b84e7bdcff937a715cb3684e016889af7a6e4d405f84841798eedddbb097825"}
Apr 20 14:56:00.240282 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:00.240257 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/1.log"
Apr 20 14:56:00.240745 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:00.240713 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/0.log"
Apr 20 14:56:00.240847 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:00.240757 2575 generic.go:358] "Generic (PLEG): container finished" podID="a3b8c0ca-6a14-4aa3-b779-8722694554e7" containerID="be9853b083ce414306c593f81edd6863c9dd5bbbdb592c5435259087a292bd32" exitCode=255
Apr 20 14:56:00.240847 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:00.240812 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" event={"ID":"a3b8c0ca-6a14-4aa3-b779-8722694554e7","Type":"ContainerDied","Data":"be9853b083ce414306c593f81edd6863c9dd5bbbdb592c5435259087a292bd32"}
Apr 20 14:56:00.240847 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:00.240843 2575 scope.go:117] "RemoveContainer" containerID="e282fe5253637c47d2512261afaebf2a61338ce336b73930f9974b78d1e12e79"
Apr 20 14:56:00.241174 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:00.241154 2575 scope.go:117] "RemoveContainer" containerID="be9853b083ce414306c593f81edd6863c9dd5bbbdb592c5435259087a292bd32"
Apr 20 14:56:00.241402 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:00.241383 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-b8vvq_openshift-console-operator(a3b8c0ca-6a14-4aa3-b779-8722694554e7)\"" pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" podUID="a3b8c0ca-6a14-4aa3-b779-8722694554e7"
Apr 20 14:56:01.243868 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:01.243833 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/1.log"
Apr 20 14:56:01.244280 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:01.244234 2575 scope.go:117] "RemoveContainer" containerID="be9853b083ce414306c593f81edd6863c9dd5bbbdb592c5435259087a292bd32"
Apr 20 14:56:01.244434 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:01.244415 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-b8vvq_openshift-console-operator(a3b8c0ca-6a14-4aa3-b779-8722694554e7)\"" pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" podUID="a3b8c0ca-6a14-4aa3-b779-8722694554e7"
Apr 20 14:56:01.835663 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:01.835557 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0709e543-4782-4ec6-a5b6-bf69ac9c6834-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rgm6x\" (UID: \"0709e543-4782-4ec6-a5b6-bf69ac9c6834\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rgm6x"
Apr 20 14:56:01.835815 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:01.835694 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 14:56:01.835815 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:01.835764 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0709e543-4782-4ec6-a5b6-bf69ac9c6834-samples-operator-tls podName:0709e543-4782-4ec6-a5b6-bf69ac9c6834 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:09.835748398 +0000 UTC m=+161.567066664 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0709e543-4782-4ec6-a5b6-bf69ac9c6834-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rgm6x" (UID: "0709e543-4782-4ec6-a5b6-bf69ac9c6834") : secret "samples-operator-tls" not found
Apr 20 14:56:02.248059 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:02.248019 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8l5tw" event={"ID":"fce9aead-ae79-449d-9e77-55a7a14471b5","Type":"ContainerStarted","Data":"27d325333982c7eca8eeac3b203db6a2f14bbb740c559feca90691abcea834a3"}
Apr 20 14:56:02.249362 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:02.249332 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9qx5x" event={"ID":"f5a93abc-e707-4adb-9942-2ed22b758d32","Type":"ContainerStarted","Data":"348d5004028c8794adaafb23528856aaa08e30851ae9ad2d92c3b1093350f5ab"}
Apr 20 14:56:02.263613 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:02.263564 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8l5tw" podStartSLOduration=2.227489176 podStartE2EDuration="4.263550037s" podCreationTimestamp="2026-04-20 14:55:58 +0000 UTC" firstStartedPulling="2026-04-20 14:55:59.423310648 +0000 UTC m=+151.154628914" lastFinishedPulling="2026-04-20 14:56:01.459371506 +0000 UTC m=+153.190689775" observedRunningTime="2026-04-20 14:56:02.263398247 +0000 UTC m=+153.994716535" watchObservedRunningTime="2026-04-20 14:56:02.263550037 +0000 UTC m=+153.994868324"
Apr 20 14:56:02.280014 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:02.279896 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9qx5x" podStartSLOduration=1.333244018 podStartE2EDuration="3.279876668s" podCreationTimestamp="2026-04-20 14:55:59 +0000 UTC" firstStartedPulling="2026-04-20 14:55:59.515690611 +0000 UTC m=+151.247008876" lastFinishedPulling="2026-04-20 14:56:01.462323256 +0000 UTC m=+153.193641526" observedRunningTime="2026-04-20 14:56:02.279825346 +0000 UTC m=+154.011143637" watchObservedRunningTime="2026-04-20 14:56:02.279876668 +0000 UTC m=+154.011194957"
Apr 20 14:56:04.192777 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:04.192725 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5d4b464978-2whf9" podUID="89ddc55e-4262-44ca-b737-16453dbd75de"
Apr 20 14:56:04.207092 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:04.207053 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl" podUID="6cda8435-e869-40a8-9726-f7b6d4767009"
Apr 20 14:56:04.228378 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:04.228345 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-t7cf5" podUID="b3af7863-723b-45a3-8247-7e29b9a9da3c"
Apr 20 14:56:04.249663 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:04.249614 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-fccb4" podUID="a808e761-5c95-412e-a362-7e3ffb34caeb"
Apr 20 14:56:04.253382 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:04.253363 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t7cf5"
Apr 20 14:56:04.253461 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:04.253388 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:56:04.253502 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:04.253498 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl"
Apr 20 14:56:05.198808 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:05.198775 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jp8d8"]
Apr 20 14:56:05.201944 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:05.201927 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-jp8d8"
Apr 20 14:56:05.204990 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:05.204968 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 20 14:56:05.205813 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:05.205792 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 20 14:56:05.205813 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:05.205806 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-qf562\""
Apr 20 14:56:05.205963 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:05.205791 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 20 14:56:05.205963 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:05.205791 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 20 14:56:05.209620 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:05.209599 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jp8d8"]
Apr 20 14:56:05.262777 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:05.262740 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8e61bc43-47e7-4954-9b73-b2d8b881209b-signing-key\") pod \"service-ca-865cb79987-jp8d8\" (UID: \"8e61bc43-47e7-4954-9b73-b2d8b881209b\") " pod="openshift-service-ca/service-ca-865cb79987-jp8d8"
Apr 20 14:56:05.262957 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:05.262883 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbd9h\" (UniqueName: \"kubernetes.io/projected/8e61bc43-47e7-4954-9b73-b2d8b881209b-kube-api-access-qbd9h\") pod \"service-ca-865cb79987-jp8d8\" (UID: \"8e61bc43-47e7-4954-9b73-b2d8b881209b\") " pod="openshift-service-ca/service-ca-865cb79987-jp8d8"
Apr 20 14:56:05.262957 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:05.262924 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8e61bc43-47e7-4954-9b73-b2d8b881209b-signing-cabundle\") pod \"service-ca-865cb79987-jp8d8\" (UID: \"8e61bc43-47e7-4954-9b73-b2d8b881209b\") " pod="openshift-service-ca/service-ca-865cb79987-jp8d8"
Apr 20 14:56:05.363375 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:05.363334 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8e61bc43-47e7-4954-9b73-b2d8b881209b-signing-cabundle\") pod \"service-ca-865cb79987-jp8d8\" (UID: \"8e61bc43-47e7-4954-9b73-b2d8b881209b\") " pod="openshift-service-ca/service-ca-865cb79987-jp8d8"
Apr 20 14:56:05.363582 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:05.363401 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8e61bc43-47e7-4954-9b73-b2d8b881209b-signing-key\") pod \"service-ca-865cb79987-jp8d8\" (UID: \"8e61bc43-47e7-4954-9b73-b2d8b881209b\") " pod="openshift-service-ca/service-ca-865cb79987-jp8d8"
Apr 20 14:56:05.363582 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:05.363538 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbd9h\" (UniqueName: \"kubernetes.io/projected/8e61bc43-47e7-4954-9b73-b2d8b881209b-kube-api-access-qbd9h\") pod \"service-ca-865cb79987-jp8d8\" (UID: \"8e61bc43-47e7-4954-9b73-b2d8b881209b\") " pod="openshift-service-ca/service-ca-865cb79987-jp8d8"
Apr 20 14:56:05.364050 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:05.364016 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8e61bc43-47e7-4954-9b73-b2d8b881209b-signing-cabundle\") pod \"service-ca-865cb79987-jp8d8\" (UID: \"8e61bc43-47e7-4954-9b73-b2d8b881209b\") " pod="openshift-service-ca/service-ca-865cb79987-jp8d8"
Apr 20 14:56:05.365848 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:05.365827 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8e61bc43-47e7-4954-9b73-b2d8b881209b-signing-key\") pod \"service-ca-865cb79987-jp8d8\" (UID: \"8e61bc43-47e7-4954-9b73-b2d8b881209b\") " pod="openshift-service-ca/service-ca-865cb79987-jp8d8"
Apr 20 14:56:05.370951 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:05.370928 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbd9h\" (UniqueName: \"kubernetes.io/projected/8e61bc43-47e7-4954-9b73-b2d8b881209b-kube-api-access-qbd9h\") pod \"service-ca-865cb79987-jp8d8\" (UID: \"8e61bc43-47e7-4954-9b73-b2d8b881209b\") " pod="openshift-service-ca/service-ca-865cb79987-jp8d8"
Apr 20 14:56:05.511071 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:05.511038 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-jp8d8"
Apr 20 14:56:05.625760 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:05.625721 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jp8d8"]
Apr 20 14:56:05.629683 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:56:05.629655 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e61bc43_47e7_4954_9b73_b2d8b881209b.slice/crio-0248f5e19fefd761a06212f2c5218da5f79b93000ee28c049cd1fe8c709316e9 WatchSource:0}: Error finding container 0248f5e19fefd761a06212f2c5218da5f79b93000ee28c049cd1fe8c709316e9: Status 404 returned error can't find the container with id 0248f5e19fefd761a06212f2c5218da5f79b93000ee28c049cd1fe8c709316e9
Apr 20 14:56:05.878249 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:05.878153 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-vbpm4" podUID="aaf83337-5403-4bd0-b782-5d5fa014368f"
Apr 20 14:56:06.262908 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:06.262867 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-jp8d8" event={"ID":"8e61bc43-47e7-4954-9b73-b2d8b881209b","Type":"ContainerStarted","Data":"f7c19041c7d4f2df90107c49b79eaedcbf99004fd7e558b4120d98e1e0ef68b4"}
Apr 20 14:56:06.263343 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:06.262915 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-jp8d8" event={"ID":"8e61bc43-47e7-4954-9b73-b2d8b881209b","Type":"ContainerStarted","Data":"0248f5e19fefd761a06212f2c5218da5f79b93000ee28c049cd1fe8c709316e9"}
Apr 20 14:56:06.283731 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:06.283681 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-jp8d8" podStartSLOduration=1.283663164 podStartE2EDuration="1.283663164s" podCreationTimestamp="2026-04-20 14:56:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:56:06.282660519 +0000 UTC m=+158.013978808" watchObservedRunningTime="2026-04-20 14:56:06.283663164 +0000 UTC m=+158.014981452"
Apr 20 14:56:06.292795 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:06.292765 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq"
Apr 20 14:56:06.292886 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:06.292815 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq"
Apr 20 14:56:06.293267 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:06.293254 2575 scope.go:117] "RemoveContainer" containerID="be9853b083ce414306c593f81edd6863c9dd5bbbdb592c5435259087a292bd32"
Apr 20 14:56:06.293480 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:06.293460 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-b8vvq_openshift-console-operator(a3b8c0ca-6a14-4aa3-b779-8722694554e7)\"" pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" podUID="a3b8c0ca-6a14-4aa3-b779-8722694554e7"
Apr 20 14:56:09.196955 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:09.196908 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5"
Apr 20 14:56:09.197487 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:09.196973 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls\") pod \"image-registry-5d4b464978-2whf9\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:56:09.197487 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:09.197020 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert\") pod \"ingress-canary-fccb4\" (UID: \"a808e761-5c95-412e-a362-7e3ffb34caeb\") " pod="openshift-ingress-canary/ingress-canary-fccb4"
Apr 20 14:56:09.197487 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:09.197060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5flsl\" (UID: \"6cda8435-e869-40a8-9726-f7b6d4767009\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl"
Apr 20 14:56:09.197487 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:09.197062 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:56:09.197487 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:09.197113 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 14:56:09.197487 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:09.197134 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d4b464978-2whf9: secret "image-registry-tls" not found
Apr 20 14:56:09.197487 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:09.197163 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls podName:b3af7863-723b-45a3-8247-7e29b9a9da3c nodeName:}" failed. No retries permitted until 2026-04-20 14:58:11.197144615 +0000 UTC m=+282.928462897 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls") pod "dns-default-t7cf5" (UID: "b3af7863-723b-45a3-8247-7e29b9a9da3c") : secret "dns-default-metrics-tls" not found
Apr 20 14:56:09.197487 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:09.197196 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls podName:89ddc55e-4262-44ca-b737-16453dbd75de nodeName:}" failed. No retries permitted until 2026-04-20 14:58:11.19718269 +0000 UTC m=+282.928500956 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls") pod "image-registry-5d4b464978-2whf9" (UID: "89ddc55e-4262-44ca-b737-16453dbd75de") : secret "image-registry-tls" not found
Apr 20 14:56:09.197487 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:09.197116 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:56:09.197487 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:09.197119 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 14:56:09.197487 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:09.197222 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert podName:a808e761-5c95-412e-a362-7e3ffb34caeb nodeName:}" failed. No retries permitted until 2026-04-20 14:58:11.197216475 +0000 UTC m=+282.928534741 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert") pod "ingress-canary-fccb4" (UID: "a808e761-5c95-412e-a362-7e3ffb34caeb") : secret "canary-serving-cert" not found
Apr 20 14:56:09.197487 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:09.197278 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert podName:6cda8435-e869-40a8-9726-f7b6d4767009 nodeName:}" failed. No retries permitted until 2026-04-20 14:58:11.197266289 +0000 UTC m=+282.928584554 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-5flsl" (UID: "6cda8435-e869-40a8-9726-f7b6d4767009") : secret "networking-console-plugin-cert" not found
Apr 20 14:56:09.904203 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:09.904164 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0709e543-4782-4ec6-a5b6-bf69ac9c6834-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rgm6x\" (UID: \"0709e543-4782-4ec6-a5b6-bf69ac9c6834\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rgm6x"
Apr 20 14:56:09.906571 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:09.906539 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0709e543-4782-4ec6-a5b6-bf69ac9c6834-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rgm6x\" (UID: \"0709e543-4782-4ec6-a5b6-bf69ac9c6834\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rgm6x"
Apr 20 14:56:09.989971 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:09.989930 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rgm6x"
Apr 20 14:56:10.119225 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:10.119183 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rgm6x"]
Apr 20 14:56:10.273868 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:10.273826 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rgm6x" event={"ID":"0709e543-4782-4ec6-a5b6-bf69ac9c6834","Type":"ContainerStarted","Data":"92ac88d396c512753d36e6286026daa8be73633a64638192b70075aa256de141"}
Apr 20 14:56:12.280135 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:12.280107 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rgm6x" event={"ID":"0709e543-4782-4ec6-a5b6-bf69ac9c6834","Type":"ContainerStarted","Data":"b9ba90c77930bbd69b4616c403573a55e4f6696574c10cc7194ba8896cfd4794"}
Apr 20 14:56:13.284415 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:13.284373 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rgm6x" event={"ID":"0709e543-4782-4ec6-a5b6-bf69ac9c6834","Type":"ContainerStarted","Data":"50f71e8a2dc2878999373bc95926d65f2bce908226dd0ef7a790a7b196c27c0b"}
Apr 20 14:56:13.299813 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:13.299763 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rgm6x" podStartSLOduration=17.294906762 podStartE2EDuration="19.299744363s" podCreationTimestamp="2026-04-20 14:55:54 +0000 UTC" firstStartedPulling="2026-04-20 14:56:10.162494524 +0000 UTC m=+161.893812791" lastFinishedPulling="2026-04-20 14:56:12.167332121 +0000 UTC m=+163.898650392" observedRunningTime="2026-04-20 14:56:13.298582281 +0000 UTC m=+165.029900569" watchObservedRunningTime="2026-04-20 14:56:13.299744363 +0000 UTC m=+165.031062654"
Apr 20 14:56:17.860020 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:17.859936 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fccb4"
Apr 20 14:56:17.860020 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:17.859944 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vbpm4"
Apr 20 14:56:17.860448 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:17.860198 2575 scope.go:117] "RemoveContainer" containerID="be9853b083ce414306c593f81edd6863c9dd5bbbdb592c5435259087a292bd32"
Apr 20 14:56:18.298144 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:18.298117 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/2.log"
Apr 20 14:56:18.298491 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:18.298475 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/1.log"
Apr 20 14:56:18.298558 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:18.298526 2575 generic.go:358] "Generic (PLEG): container finished" podID="a3b8c0ca-6a14-4aa3-b779-8722694554e7" containerID="b7c36b2583a8df9b07b0f7df60ae1c7eb0c6950238acdc900687a6955ca17ebf" exitCode=255
Apr 20 14:56:18.298597 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:18.298561 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" event={"ID":"a3b8c0ca-6a14-4aa3-b779-8722694554e7","Type":"ContainerDied","Data":"b7c36b2583a8df9b07b0f7df60ae1c7eb0c6950238acdc900687a6955ca17ebf"}
Apr 20 14:56:18.298597 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:18.298588 2575 scope.go:117] "RemoveContainer" containerID="be9853b083ce414306c593f81edd6863c9dd5bbbdb592c5435259087a292bd32"
Apr 20 14:56:18.298906 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:18.298884 2575 scope.go:117] "RemoveContainer" containerID="b7c36b2583a8df9b07b0f7df60ae1c7eb0c6950238acdc900687a6955ca17ebf"
Apr 20 14:56:18.299090 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:18.299071 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-b8vvq_openshift-console-operator(a3b8c0ca-6a14-4aa3-b779-8722694554e7)\"" pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" podUID="a3b8c0ca-6a14-4aa3-b779-8722694554e7"
Apr 20 14:56:19.302157 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:19.302131 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/2.log"
Apr 20 14:56:25.153272 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.153241 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xf6wz"]
Apr 20 14:56:25.157205 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.157183 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xf6wz"
Apr 20 14:56:25.159640 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.159614 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 20 14:56:25.159760 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.159630 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 20 14:56:25.160693 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.160675 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 20 14:56:25.160782 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.160710 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 20 14:56:25.160782 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.160766 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-z4lp2\""
Apr 20 14:56:25.166849 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.166826 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xf6wz"]
Apr 20 14:56:25.249885 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.249845 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pgpn6"]
Apr 20 14:56:25.252878 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.252855 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pgpn6" Apr 20 14:56:25.255174 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.255150 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 20 14:56:25.255300 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.255201 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-mlvl6\"" Apr 20 14:56:25.261606 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.261585 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pgpn6"] Apr 20 14:56:25.331019 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.330985 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx8nt\" (UniqueName: \"kubernetes.io/projected/c79acd36-8478-42d1-bf37-0e5b738f4737-kube-api-access-dx8nt\") pod \"insights-runtime-extractor-xf6wz\" (UID: \"c79acd36-8478-42d1-bf37-0e5b738f4737\") " pod="openshift-insights/insights-runtime-extractor-xf6wz" Apr 20 14:56:25.331019 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.331020 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c79acd36-8478-42d1-bf37-0e5b738f4737-crio-socket\") pod \"insights-runtime-extractor-xf6wz\" (UID: \"c79acd36-8478-42d1-bf37-0e5b738f4737\") " pod="openshift-insights/insights-runtime-extractor-xf6wz" Apr 20 14:56:25.331278 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.331067 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c79acd36-8478-42d1-bf37-0e5b738f4737-data-volume\") pod 
\"insights-runtime-extractor-xf6wz\" (UID: \"c79acd36-8478-42d1-bf37-0e5b738f4737\") " pod="openshift-insights/insights-runtime-extractor-xf6wz" Apr 20 14:56:25.331278 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.331143 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c79acd36-8478-42d1-bf37-0e5b738f4737-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xf6wz\" (UID: \"c79acd36-8478-42d1-bf37-0e5b738f4737\") " pod="openshift-insights/insights-runtime-extractor-xf6wz" Apr 20 14:56:25.331278 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.331211 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c79acd36-8478-42d1-bf37-0e5b738f4737-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xf6wz\" (UID: \"c79acd36-8478-42d1-bf37-0e5b738f4737\") " pod="openshift-insights/insights-runtime-extractor-xf6wz" Apr 20 14:56:25.431847 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.431763 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c79acd36-8478-42d1-bf37-0e5b738f4737-data-volume\") pod \"insights-runtime-extractor-xf6wz\" (UID: \"c79acd36-8478-42d1-bf37-0e5b738f4737\") " pod="openshift-insights/insights-runtime-extractor-xf6wz" Apr 20 14:56:25.431847 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.431803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c79acd36-8478-42d1-bf37-0e5b738f4737-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xf6wz\" (UID: \"c79acd36-8478-42d1-bf37-0e5b738f4737\") " pod="openshift-insights/insights-runtime-extractor-xf6wz" Apr 20 14:56:25.432072 ip-10-0-129-82 kubenswrapper[2575]: 
I0420 14:56:25.431862 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8d600620-fcd1-47ac-884e-38d6ff2fb62c-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-pgpn6\" (UID: \"8d600620-fcd1-47ac-884e-38d6ff2fb62c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pgpn6" Apr 20 14:56:25.432072 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.431895 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c79acd36-8478-42d1-bf37-0e5b738f4737-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xf6wz\" (UID: \"c79acd36-8478-42d1-bf37-0e5b738f4737\") " pod="openshift-insights/insights-runtime-extractor-xf6wz" Apr 20 14:56:25.432072 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.432009 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dx8nt\" (UniqueName: \"kubernetes.io/projected/c79acd36-8478-42d1-bf37-0e5b738f4737-kube-api-access-dx8nt\") pod \"insights-runtime-extractor-xf6wz\" (UID: \"c79acd36-8478-42d1-bf37-0e5b738f4737\") " pod="openshift-insights/insights-runtime-extractor-xf6wz" Apr 20 14:56:25.432072 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.432048 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c79acd36-8478-42d1-bf37-0e5b738f4737-crio-socket\") pod \"insights-runtime-extractor-xf6wz\" (UID: \"c79acd36-8478-42d1-bf37-0e5b738f4737\") " pod="openshift-insights/insights-runtime-extractor-xf6wz" Apr 20 14:56:25.432241 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.432108 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c79acd36-8478-42d1-bf37-0e5b738f4737-data-volume\") pod 
\"insights-runtime-extractor-xf6wz\" (UID: \"c79acd36-8478-42d1-bf37-0e5b738f4737\") " pod="openshift-insights/insights-runtime-extractor-xf6wz" Apr 20 14:56:25.432241 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.432136 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c79acd36-8478-42d1-bf37-0e5b738f4737-crio-socket\") pod \"insights-runtime-extractor-xf6wz\" (UID: \"c79acd36-8478-42d1-bf37-0e5b738f4737\") " pod="openshift-insights/insights-runtime-extractor-xf6wz" Apr 20 14:56:25.432336 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.432319 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c79acd36-8478-42d1-bf37-0e5b738f4737-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xf6wz\" (UID: \"c79acd36-8478-42d1-bf37-0e5b738f4737\") " pod="openshift-insights/insights-runtime-extractor-xf6wz" Apr 20 14:56:25.434215 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.434198 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c79acd36-8478-42d1-bf37-0e5b738f4737-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xf6wz\" (UID: \"c79acd36-8478-42d1-bf37-0e5b738f4737\") " pod="openshift-insights/insights-runtime-extractor-xf6wz" Apr 20 14:56:25.442953 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.442929 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx8nt\" (UniqueName: \"kubernetes.io/projected/c79acd36-8478-42d1-bf37-0e5b738f4737-kube-api-access-dx8nt\") pod \"insights-runtime-extractor-xf6wz\" (UID: \"c79acd36-8478-42d1-bf37-0e5b738f4737\") " pod="openshift-insights/insights-runtime-extractor-xf6wz" Apr 20 14:56:25.466232 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.466195 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xf6wz" Apr 20 14:56:25.532898 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.532856 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8d600620-fcd1-47ac-884e-38d6ff2fb62c-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-pgpn6\" (UID: \"8d600620-fcd1-47ac-884e-38d6ff2fb62c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pgpn6" Apr 20 14:56:25.535146 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.535117 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8d600620-fcd1-47ac-884e-38d6ff2fb62c-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-pgpn6\" (UID: \"8d600620-fcd1-47ac-884e-38d6ff2fb62c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pgpn6" Apr 20 14:56:25.562126 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.562097 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pgpn6" Apr 20 14:56:25.588134 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.588098 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xf6wz"] Apr 20 14:56:25.591051 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:56:25.591020 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc79acd36_8478_42d1_bf37_0e5b738f4737.slice/crio-4646c16a9be438610687c40665eaf0d9db8ccdd0d40359326b5d8479d99e0207 WatchSource:0}: Error finding container 4646c16a9be438610687c40665eaf0d9db8ccdd0d40359326b5d8479d99e0207: Status 404 returned error can't find the container with id 4646c16a9be438610687c40665eaf0d9db8ccdd0d40359326b5d8479d99e0207 Apr 20 14:56:25.685709 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:25.685633 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pgpn6"] Apr 20 14:56:25.689451 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:56:25.689420 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d600620_fcd1_47ac_884e_38d6ff2fb62c.slice/crio-197534764a2c53b86935e246a94bbee06ae84de26768b97983fb0a9d3e8fcf44 WatchSource:0}: Error finding container 197534764a2c53b86935e246a94bbee06ae84de26768b97983fb0a9d3e8fcf44: Status 404 returned error can't find the container with id 197534764a2c53b86935e246a94bbee06ae84de26768b97983fb0a9d3e8fcf44 Apr 20 14:56:26.292529 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:26.292496 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" Apr 20 14:56:26.292915 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:26.292539 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" Apr 20 14:56:26.292972 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:26.292943 2575 scope.go:117] "RemoveContainer" containerID="b7c36b2583a8df9b07b0f7df60ae1c7eb0c6950238acdc900687a6955ca17ebf" Apr 20 14:56:26.293149 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:26.293129 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-b8vvq_openshift-console-operator(a3b8c0ca-6a14-4aa3-b779-8722694554e7)\"" pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" podUID="a3b8c0ca-6a14-4aa3-b779-8722694554e7" Apr 20 14:56:26.319688 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:26.319654 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pgpn6" event={"ID":"8d600620-fcd1-47ac-884e-38d6ff2fb62c","Type":"ContainerStarted","Data":"197534764a2c53b86935e246a94bbee06ae84de26768b97983fb0a9d3e8fcf44"} Apr 20 14:56:26.320967 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:26.320937 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xf6wz" event={"ID":"c79acd36-8478-42d1-bf37-0e5b738f4737","Type":"ContainerStarted","Data":"87f253dffafb40dd48975c9545dac10d0167469cf29652920d7e0eb4e516df67"} Apr 20 14:56:26.320967 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:26.320966 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xf6wz" event={"ID":"c79acd36-8478-42d1-bf37-0e5b738f4737","Type":"ContainerStarted","Data":"4646c16a9be438610687c40665eaf0d9db8ccdd0d40359326b5d8479d99e0207"} Apr 20 14:56:27.324848 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.324806 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pgpn6" event={"ID":"8d600620-fcd1-47ac-884e-38d6ff2fb62c","Type":"ContainerStarted","Data":"d80cc265468a848cbffcf9ad7e0d41d5c1b95ea89729c28256d10d1e1202e461"} Apr 20 14:56:27.325320 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.324995 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pgpn6" Apr 20 14:56:27.326676 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.326647 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xf6wz" event={"ID":"c79acd36-8478-42d1-bf37-0e5b738f4737","Type":"ContainerStarted","Data":"28a16f972323f18b2d73bd9c295c71f9a0dc45768cac1f502b1c33bc3bde5c20"} Apr 20 14:56:27.330866 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.330842 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pgpn6" Apr 20 14:56:27.339488 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.339443 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pgpn6" podStartSLOduration=1.266714579 podStartE2EDuration="2.339431457s" podCreationTimestamp="2026-04-20 14:56:25 +0000 UTC" firstStartedPulling="2026-04-20 14:56:25.691813972 +0000 UTC m=+177.423132237" lastFinishedPulling="2026-04-20 14:56:26.764530846 +0000 UTC m=+178.495849115" observedRunningTime="2026-04-20 14:56:27.338562786 +0000 UTC m=+179.069881075" watchObservedRunningTime="2026-04-20 14:56:27.339431457 +0000 UTC m=+179.070749745" Apr 20 14:56:27.504574 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.504540 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-zjqc7"] Apr 20 14:56:27.508870 ip-10-0-129-82 kubenswrapper[2575]: I0420 
14:56:27.508843 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-zjqc7" Apr 20 14:56:27.511495 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.511441 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-z56t5\"" Apr 20 14:56:27.511622 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.511524 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 14:56:27.511622 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.511543 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 14:56:27.511622 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.511442 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 20 14:56:27.511622 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.511544 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 14:56:27.511622 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.511556 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 20 14:56:27.518284 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.518245 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-zjqc7"] Apr 20 14:56:27.653852 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.653775 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/af3e061e-0ffd-4228-88bb-e228561992bd-metrics-client-ca\") pod 
\"prometheus-operator-5676c8c784-zjqc7\" (UID: \"af3e061e-0ffd-4228-88bb-e228561992bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjqc7" Apr 20 14:56:27.653986 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.653875 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzz7l\" (UniqueName: \"kubernetes.io/projected/af3e061e-0ffd-4228-88bb-e228561992bd-kube-api-access-jzz7l\") pod \"prometheus-operator-5676c8c784-zjqc7\" (UID: \"af3e061e-0ffd-4228-88bb-e228561992bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjqc7" Apr 20 14:56:27.653986 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.653915 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/af3e061e-0ffd-4228-88bb-e228561992bd-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-zjqc7\" (UID: \"af3e061e-0ffd-4228-88bb-e228561992bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjqc7" Apr 20 14:56:27.654048 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.654008 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/af3e061e-0ffd-4228-88bb-e228561992bd-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-zjqc7\" (UID: \"af3e061e-0ffd-4228-88bb-e228561992bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjqc7" Apr 20 14:56:27.754404 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.754378 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/af3e061e-0ffd-4228-88bb-e228561992bd-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-zjqc7\" (UID: \"af3e061e-0ffd-4228-88bb-e228561992bd\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-zjqc7" Apr 20 14:56:27.754504 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.754430 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/af3e061e-0ffd-4228-88bb-e228561992bd-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-zjqc7\" (UID: \"af3e061e-0ffd-4228-88bb-e228561992bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjqc7" Apr 20 14:56:27.754504 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.754477 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jzz7l\" (UniqueName: \"kubernetes.io/projected/af3e061e-0ffd-4228-88bb-e228561992bd-kube-api-access-jzz7l\") pod \"prometheus-operator-5676c8c784-zjqc7\" (UID: \"af3e061e-0ffd-4228-88bb-e228561992bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjqc7" Apr 20 14:56:27.754642 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.754502 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/af3e061e-0ffd-4228-88bb-e228561992bd-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-zjqc7\" (UID: \"af3e061e-0ffd-4228-88bb-e228561992bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjqc7" Apr 20 14:56:27.754642 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:27.754573 2575 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 20 14:56:27.754753 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:27.754642 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af3e061e-0ffd-4228-88bb-e228561992bd-prometheus-operator-tls podName:af3e061e-0ffd-4228-88bb-e228561992bd nodeName:}" failed. 
No retries permitted until 2026-04-20 14:56:28.254621283 +0000 UTC m=+179.985939572 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/af3e061e-0ffd-4228-88bb-e228561992bd-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-zjqc7" (UID: "af3e061e-0ffd-4228-88bb-e228561992bd") : secret "prometheus-operator-tls" not found Apr 20 14:56:27.755158 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.755142 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/af3e061e-0ffd-4228-88bb-e228561992bd-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-zjqc7\" (UID: \"af3e061e-0ffd-4228-88bb-e228561992bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjqc7" Apr 20 14:56:27.756933 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.756908 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/af3e061e-0ffd-4228-88bb-e228561992bd-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-zjqc7\" (UID: \"af3e061e-0ffd-4228-88bb-e228561992bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjqc7" Apr 20 14:56:27.763074 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:27.763051 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzz7l\" (UniqueName: \"kubernetes.io/projected/af3e061e-0ffd-4228-88bb-e228561992bd-kube-api-access-jzz7l\") pod \"prometheus-operator-5676c8c784-zjqc7\" (UID: \"af3e061e-0ffd-4228-88bb-e228561992bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjqc7" Apr 20 14:56:28.258326 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:28.258291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/af3e061e-0ffd-4228-88bb-e228561992bd-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-zjqc7\" (UID: \"af3e061e-0ffd-4228-88bb-e228561992bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjqc7" Apr 20 14:56:28.260613 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:28.260586 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/af3e061e-0ffd-4228-88bb-e228561992bd-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-zjqc7\" (UID: \"af3e061e-0ffd-4228-88bb-e228561992bd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjqc7" Apr 20 14:56:28.333415 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:28.333379 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xf6wz" event={"ID":"c79acd36-8478-42d1-bf37-0e5b738f4737","Type":"ContainerStarted","Data":"8444779e84c398a4adc507c59e4a9defe75822f6f090194f359bb9a9167883fb"} Apr 20 14:56:28.349949 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:28.349900 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xf6wz" podStartSLOduration=1.3036635300000001 podStartE2EDuration="3.349884798s" podCreationTimestamp="2026-04-20 14:56:25 +0000 UTC" firstStartedPulling="2026-04-20 14:56:25.656952362 +0000 UTC m=+177.388270644" lastFinishedPulling="2026-04-20 14:56:27.703173647 +0000 UTC m=+179.434491912" observedRunningTime="2026-04-20 14:56:28.349133297 +0000 UTC m=+180.080451587" watchObservedRunningTime="2026-04-20 14:56:28.349884798 +0000 UTC m=+180.081203085" Apr 20 14:56:28.419680 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:28.419643 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-zjqc7"
Apr 20 14:56:28.533290 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:28.533203 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-zjqc7"]
Apr 20 14:56:28.536794 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:56:28.536769 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf3e061e_0ffd_4228_88bb_e228561992bd.slice/crio-770ef11afd6c3d3ee4a523649f17ee78087e9bd2a7abbe36c18a08e1d4751a21 WatchSource:0}: Error finding container 770ef11afd6c3d3ee4a523649f17ee78087e9bd2a7abbe36c18a08e1d4751a21: Status 404 returned error can't find the container with id 770ef11afd6c3d3ee4a523649f17ee78087e9bd2a7abbe36c18a08e1d4751a21
Apr 20 14:56:29.337083 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:29.337040 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-zjqc7" event={"ID":"af3e061e-0ffd-4228-88bb-e228561992bd","Type":"ContainerStarted","Data":"770ef11afd6c3d3ee4a523649f17ee78087e9bd2a7abbe36c18a08e1d4751a21"}
Apr 20 14:56:30.340707 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:30.340672 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-zjqc7" event={"ID":"af3e061e-0ffd-4228-88bb-e228561992bd","Type":"ContainerStarted","Data":"bf4ce8c3e3f977956c804195129247d7120e49137b7facd8e01c82dcfcc35f55"}
Apr 20 14:56:30.340707 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:30.340712 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-zjqc7" event={"ID":"af3e061e-0ffd-4228-88bb-e228561992bd","Type":"ContainerStarted","Data":"33144669b9707d39681e338d968a596fce24d99ea4f01309fdb68434908694b4"}
Apr 20 14:56:30.356889 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:30.356840 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-zjqc7" podStartSLOduration=2.019802738 podStartE2EDuration="3.356823372s" podCreationTimestamp="2026-04-20 14:56:27 +0000 UTC" firstStartedPulling="2026-04-20 14:56:28.538626716 +0000 UTC m=+180.269944982" lastFinishedPulling="2026-04-20 14:56:29.875647347 +0000 UTC m=+181.606965616" observedRunningTime="2026-04-20 14:56:30.355602125 +0000 UTC m=+182.086920413" watchObservedRunningTime="2026-04-20 14:56:30.356823372 +0000 UTC m=+182.088141660"
Apr 20 14:56:31.839051 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.839009 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww"]
Apr 20 14:56:31.842661 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.842637 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww"
Apr 20 14:56:31.845685 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.845662 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 20 14:56:31.845827 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.845764 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-l27jv\""
Apr 20 14:56:31.846046 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.846013 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 20 14:56:31.848551 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.848528 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-n6qq5"]
Apr 20 14:56:31.855108 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.852875 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww"]
Apr 20 14:56:31.855108 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.853005 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.855108 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.854384 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"]
Apr 20 14:56:31.856202 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.856181 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2xwxs\""
Apr 20 14:56:31.858927 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.858911 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 14:56:31.863313 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.859461 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 14:56:31.863541 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.860062 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 14:56:31.863942 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.863923 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:31.868567 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.867943 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 20 14:56:31.868567 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.868023 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 20 14:56:31.868567 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.868249 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-mcm9c\""
Apr 20 14:56:31.868567 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.868375 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 20 14:56:31.873872 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.873270 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"]
Apr 20 14:56:31.887551 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.887488 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fe2df217-e552-4eed-993d-b467cccf24b4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-cq7ww\" (UID: \"fe2df217-e552-4eed-993d-b467cccf24b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww"
Apr 20 14:56:31.887658 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.887564 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c974df5b-afcc-4232-9913-36d4d36cd14b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mk8bk\" (UID: \"c974df5b-afcc-4232-9913-36d4d36cd14b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:31.887698 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.887683 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7cde96ba-1beb-4dd0-91b3-6c4339468969-root\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.887742 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.887720 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c974df5b-afcc-4232-9913-36d4d36cd14b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mk8bk\" (UID: \"c974df5b-afcc-4232-9913-36d4d36cd14b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:31.887776 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.887752 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c974df5b-afcc-4232-9913-36d4d36cd14b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mk8bk\" (UID: \"c974df5b-afcc-4232-9913-36d4d36cd14b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:31.887826 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.887779 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7cde96ba-1beb-4dd0-91b3-6c4339468969-node-exporter-tls\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.887893 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.887873 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c974df5b-afcc-4232-9913-36d4d36cd14b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mk8bk\" (UID: \"c974df5b-afcc-4232-9913-36d4d36cd14b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:31.887948 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.887908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7cde96ba-1beb-4dd0-91b3-6c4339468969-node-exporter-wtmp\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.888037 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.887962 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7cde96ba-1beb-4dd0-91b3-6c4339468969-node-exporter-accelerators-collector-config\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.888037 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.887995 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fe2df217-e552-4eed-993d-b467cccf24b4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-cq7ww\" (UID: \"fe2df217-e552-4eed-993d-b467cccf24b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww"
Apr 20 14:56:31.888037 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.888023 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fe2df217-e552-4eed-993d-b467cccf24b4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-cq7ww\" (UID: \"fe2df217-e552-4eed-993d-b467cccf24b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww"
Apr 20 14:56:31.888189 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.888072 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7cde96ba-1beb-4dd0-91b3-6c4339468969-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.888189 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.888130 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdxbb\" (UniqueName: \"kubernetes.io/projected/fe2df217-e552-4eed-993d-b467cccf24b4-kube-api-access-kdxbb\") pod \"openshift-state-metrics-9d44df66c-cq7ww\" (UID: \"fe2df217-e552-4eed-993d-b467cccf24b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww"
Apr 20 14:56:31.888189 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.888156 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c974df5b-afcc-4232-9913-36d4d36cd14b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mk8bk\" (UID: \"c974df5b-afcc-4232-9913-36d4d36cd14b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:31.888336 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.888198 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7cde96ba-1beb-4dd0-91b3-6c4339468969-sys\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.888336 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.888232 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7cde96ba-1beb-4dd0-91b3-6c4339468969-metrics-client-ca\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.888336 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.888255 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qttrs\" (UniqueName: \"kubernetes.io/projected/7cde96ba-1beb-4dd0-91b3-6c4339468969-kube-api-access-qttrs\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.888336 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.888286 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndr7w\" (UniqueName: \"kubernetes.io/projected/c974df5b-afcc-4232-9913-36d4d36cd14b-kube-api-access-ndr7w\") pod \"kube-state-metrics-69db897b98-mk8bk\" (UID: \"c974df5b-afcc-4232-9913-36d4d36cd14b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:31.888336 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.888313 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7cde96ba-1beb-4dd0-91b3-6c4339468969-node-exporter-textfile\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.989527 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.989460 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndr7w\" (UniqueName: \"kubernetes.io/projected/c974df5b-afcc-4232-9913-36d4d36cd14b-kube-api-access-ndr7w\") pod \"kube-state-metrics-69db897b98-mk8bk\" (UID: \"c974df5b-afcc-4232-9913-36d4d36cd14b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:31.989527 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.989530 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7cde96ba-1beb-4dd0-91b3-6c4339468969-node-exporter-textfile\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.989786 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.989737 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fe2df217-e552-4eed-993d-b467cccf24b4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-cq7ww\" (UID: \"fe2df217-e552-4eed-993d-b467cccf24b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww"
Apr 20 14:56:31.989847 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.989802 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c974df5b-afcc-4232-9913-36d4d36cd14b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mk8bk\" (UID: \"c974df5b-afcc-4232-9913-36d4d36cd14b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:31.989903 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.989851 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7cde96ba-1beb-4dd0-91b3-6c4339468969-root\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.989903 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.989878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c974df5b-afcc-4232-9913-36d4d36cd14b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mk8bk\" (UID: \"c974df5b-afcc-4232-9913-36d4d36cd14b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:31.989903 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.989889 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7cde96ba-1beb-4dd0-91b3-6c4339468969-node-exporter-textfile\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.990047 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.989908 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c974df5b-afcc-4232-9913-36d4d36cd14b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mk8bk\" (UID: \"c974df5b-afcc-4232-9913-36d4d36cd14b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:31.990047 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.989934 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7cde96ba-1beb-4dd0-91b3-6c4339468969-node-exporter-tls\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.990047 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.990001 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c974df5b-afcc-4232-9913-36d4d36cd14b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mk8bk\" (UID: \"c974df5b-afcc-4232-9913-36d4d36cd14b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:31.990047 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.990027 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7cde96ba-1beb-4dd0-91b3-6c4339468969-node-exporter-wtmp\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.990233 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.990081 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7cde96ba-1beb-4dd0-91b3-6c4339468969-node-exporter-accelerators-collector-config\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.990233 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.990117 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fe2df217-e552-4eed-993d-b467cccf24b4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-cq7ww\" (UID: \"fe2df217-e552-4eed-993d-b467cccf24b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww"
Apr 20 14:56:31.990233 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.990145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fe2df217-e552-4eed-993d-b467cccf24b4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-cq7ww\" (UID: \"fe2df217-e552-4eed-993d-b467cccf24b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww"
Apr 20 14:56:31.990233 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.990191 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7cde96ba-1beb-4dd0-91b3-6c4339468969-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.990426 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.990234 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdxbb\" (UniqueName: \"kubernetes.io/projected/fe2df217-e552-4eed-993d-b467cccf24b4-kube-api-access-kdxbb\") pod \"openshift-state-metrics-9d44df66c-cq7ww\" (UID: \"fe2df217-e552-4eed-993d-b467cccf24b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww"
Apr 20 14:56:31.990426 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.990262 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c974df5b-afcc-4232-9913-36d4d36cd14b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mk8bk\" (UID: \"c974df5b-afcc-4232-9913-36d4d36cd14b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:31.990426 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.990305 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7cde96ba-1beb-4dd0-91b3-6c4339468969-sys\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.990426 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.990340 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7cde96ba-1beb-4dd0-91b3-6c4339468969-metrics-client-ca\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.990426 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.990364 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qttrs\" (UniqueName: \"kubernetes.io/projected/7cde96ba-1beb-4dd0-91b3-6c4339468969-kube-api-access-qttrs\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.990707 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.990628 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fe2df217-e552-4eed-993d-b467cccf24b4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-cq7ww\" (UID: \"fe2df217-e552-4eed-993d-b467cccf24b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww"
Apr 20 14:56:31.990707 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.990647 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c974df5b-afcc-4232-9913-36d4d36cd14b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mk8bk\" (UID: \"c974df5b-afcc-4232-9913-36d4d36cd14b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:31.990707 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.990693 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7cde96ba-1beb-4dd0-91b3-6c4339468969-root\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.990844 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:31.990752 2575 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 20 14:56:31.990844 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:31.990781 2575 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 20 14:56:31.990844 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:31.990813 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe2df217-e552-4eed-993d-b467cccf24b4-openshift-state-metrics-tls podName:fe2df217-e552-4eed-993d-b467cccf24b4 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:32.490793704 +0000 UTC m=+184.222111983 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/fe2df217-e552-4eed-993d-b467cccf24b4-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-cq7ww" (UID: "fe2df217-e552-4eed-993d-b467cccf24b4") : secret "openshift-state-metrics-tls" not found
Apr 20 14:56:31.990844 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:31.990842 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c974df5b-afcc-4232-9913-36d4d36cd14b-kube-state-metrics-tls podName:c974df5b-afcc-4232-9913-36d4d36cd14b nodeName:}" failed. No retries permitted until 2026-04-20 14:56:32.490825739 +0000 UTC m=+184.222144008 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/c974df5b-afcc-4232-9913-36d4d36cd14b-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-mk8bk" (UID: "c974df5b-afcc-4232-9913-36d4d36cd14b") : secret "kube-state-metrics-tls" not found
Apr 20 14:56:31.991187 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.991086 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7cde96ba-1beb-4dd0-91b3-6c4339468969-sys\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.991249 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.991181 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7cde96ba-1beb-4dd0-91b3-6c4339468969-node-exporter-wtmp\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.991420 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.991399 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c974df5b-afcc-4232-9913-36d4d36cd14b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mk8bk\" (UID: \"c974df5b-afcc-4232-9913-36d4d36cd14b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:31.991603 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.991581 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7cde96ba-1beb-4dd0-91b3-6c4339468969-metrics-client-ca\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.991900 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.991880 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c974df5b-afcc-4232-9913-36d4d36cd14b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mk8bk\" (UID: \"c974df5b-afcc-4232-9913-36d4d36cd14b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:31.991900 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.991892 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7cde96ba-1beb-4dd0-91b3-6c4339468969-node-exporter-accelerators-collector-config\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.992086 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:31.992001 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 14:56:31.992086 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:31.992077 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cde96ba-1beb-4dd0-91b3-6c4339468969-node-exporter-tls podName:7cde96ba-1beb-4dd0-91b3-6c4339468969 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:32.492059188 +0000 UTC m=+184.223377472 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/7cde96ba-1beb-4dd0-91b3-6c4339468969-node-exporter-tls") pod "node-exporter-n6qq5" (UID: "7cde96ba-1beb-4dd0-91b3-6c4339468969") : secret "node-exporter-tls" not found
Apr 20 14:56:31.994177 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.994155 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c974df5b-afcc-4232-9913-36d4d36cd14b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mk8bk\" (UID: \"c974df5b-afcc-4232-9913-36d4d36cd14b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:31.994297 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.994255 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7cde96ba-1beb-4dd0-91b3-6c4339468969-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:31.995046 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.995023 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fe2df217-e552-4eed-993d-b467cccf24b4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-cq7ww\" (UID: \"fe2df217-e552-4eed-993d-b467cccf24b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww"
Apr 20 14:56:31.997618 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.997568 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndr7w\" (UniqueName: \"kubernetes.io/projected/c974df5b-afcc-4232-9913-36d4d36cd14b-kube-api-access-ndr7w\") pod \"kube-state-metrics-69db897b98-mk8bk\" (UID: \"c974df5b-afcc-4232-9913-36d4d36cd14b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:31.998324 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.998297 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdxbb\" (UniqueName: \"kubernetes.io/projected/fe2df217-e552-4eed-993d-b467cccf24b4-kube-api-access-kdxbb\") pod \"openshift-state-metrics-9d44df66c-cq7ww\" (UID: \"fe2df217-e552-4eed-993d-b467cccf24b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww"
Apr 20 14:56:31.998637 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:31.998621 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qttrs\" (UniqueName: \"kubernetes.io/projected/7cde96ba-1beb-4dd0-91b3-6c4339468969-kube-api-access-qttrs\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:32.494878 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:32.494840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7cde96ba-1beb-4dd0-91b3-6c4339468969-node-exporter-tls\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:32.495090 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:32.494914 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fe2df217-e552-4eed-993d-b467cccf24b4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-cq7ww\" (UID: \"fe2df217-e552-4eed-993d-b467cccf24b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww"
Apr 20 14:56:32.495090 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:32.494951 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c974df5b-afcc-4232-9913-36d4d36cd14b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mk8bk\" (UID: \"c974df5b-afcc-4232-9913-36d4d36cd14b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:32.495090 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:32.495073 2575 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 20 14:56:32.495242 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:32.495200 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe2df217-e552-4eed-993d-b467cccf24b4-openshift-state-metrics-tls podName:fe2df217-e552-4eed-993d-b467cccf24b4 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:33.495177433 +0000 UTC m=+185.226495703 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/fe2df217-e552-4eed-993d-b467cccf24b4-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-cq7ww" (UID: "fe2df217-e552-4eed-993d-b467cccf24b4") : secret "openshift-state-metrics-tls" not found
Apr 20 14:56:32.497850 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:32.497818 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7cde96ba-1beb-4dd0-91b3-6c4339468969-node-exporter-tls\") pod \"node-exporter-n6qq5\" (UID: \"7cde96ba-1beb-4dd0-91b3-6c4339468969\") " pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:32.497989 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:32.497965 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c974df5b-afcc-4232-9913-36d4d36cd14b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mk8bk\" (UID: \"c974df5b-afcc-4232-9913-36d4d36cd14b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:32.776011 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:32.775923 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-n6qq5"
Apr 20 14:56:32.783840 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:32.783810 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
Apr 20 14:56:32.785284 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:56:32.785257 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cde96ba_1beb_4dd0_91b3_6c4339468969.slice/crio-e049b6a36d6ebaa2c41bac6018f70f0517eb82b1e256229e3faf7cc41276cd57 WatchSource:0}: Error finding container e049b6a36d6ebaa2c41bac6018f70f0517eb82b1e256229e3faf7cc41276cd57: Status 404 returned error can't find the container with id e049b6a36d6ebaa2c41bac6018f70f0517eb82b1e256229e3faf7cc41276cd57
Apr 20 14:56:32.914552 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:32.914507 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"]
Apr 20 14:56:32.917488 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:56:32.917461 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc974df5b_afcc_4232_9913_36d4d36cd14b.slice/crio-517a6c6269115000e1d21299149bfa8feb5011142adfe32bf199eb862be2d5f1 WatchSource:0}: Error finding container 517a6c6269115000e1d21299149bfa8feb5011142adfe32bf199eb862be2d5f1: Status 404 returned error can't find the container with id 517a6c6269115000e1d21299149bfa8feb5011142adfe32bf199eb862be2d5f1
Apr 20 14:56:33.350626 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:33.350586 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk"
event={"ID":"c974df5b-afcc-4232-9913-36d4d36cd14b","Type":"ContainerStarted","Data":"517a6c6269115000e1d21299149bfa8feb5011142adfe32bf199eb862be2d5f1"} Apr 20 14:56:33.352134 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:33.352103 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n6qq5" event={"ID":"7cde96ba-1beb-4dd0-91b3-6c4339468969","Type":"ContainerStarted","Data":"e049b6a36d6ebaa2c41bac6018f70f0517eb82b1e256229e3faf7cc41276cd57"} Apr 20 14:56:33.504465 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:33.504415 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fe2df217-e552-4eed-993d-b467cccf24b4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-cq7ww\" (UID: \"fe2df217-e552-4eed-993d-b467cccf24b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww" Apr 20 14:56:33.507679 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:33.507637 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fe2df217-e552-4eed-993d-b467cccf24b4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-cq7ww\" (UID: \"fe2df217-e552-4eed-993d-b467cccf24b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww" Apr 20 14:56:33.654787 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:33.654661 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww" Apr 20 14:56:33.804878 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:33.804820 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww"] Apr 20 14:56:33.809396 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:56:33.809365 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe2df217_e552_4eed_993d_b467cccf24b4.slice/crio-1465526831de08be8de54cecdcd3f514fdf368c70c003086d32c419a7798e42a WatchSource:0}: Error finding container 1465526831de08be8de54cecdcd3f514fdf368c70c003086d32c419a7798e42a: Status 404 returned error can't find the container with id 1465526831de08be8de54cecdcd3f514fdf368c70c003086d32c419a7798e42a Apr 20 14:56:34.358526 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:34.358464 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk" event={"ID":"c974df5b-afcc-4232-9913-36d4d36cd14b","Type":"ContainerStarted","Data":"7873a87e255b976bbe902cf7774a8de1a4f8ad7ca8f138a8f15295050d195672"} Apr 20 14:56:34.358914 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:34.358506 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk" event={"ID":"c974df5b-afcc-4232-9913-36d4d36cd14b","Type":"ContainerStarted","Data":"e805d256cd6e990f698f50518e02aa836403002138fd06a18cda9bf88f8e8bb2"} Apr 20 14:56:34.360168 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:34.360032 2575 generic.go:358] "Generic (PLEG): container finished" podID="7cde96ba-1beb-4dd0-91b3-6c4339468969" containerID="f9d621549788752e91d77c2be6ac930c5884db3d9945d490e5cd6fc8935d521c" exitCode=0 Apr 20 14:56:34.360168 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:34.360112 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-n6qq5" event={"ID":"7cde96ba-1beb-4dd0-91b3-6c4339468969","Type":"ContainerDied","Data":"f9d621549788752e91d77c2be6ac930c5884db3d9945d490e5cd6fc8935d521c"} Apr 20 14:56:34.362964 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:34.362709 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww" event={"ID":"fe2df217-e552-4eed-993d-b467cccf24b4","Type":"ContainerStarted","Data":"07a9f558e4d55d50bc765873bd221ebc34116aa50cb10a10f21ee4e11784eb6f"} Apr 20 14:56:34.362964 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:34.362745 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww" event={"ID":"fe2df217-e552-4eed-993d-b467cccf24b4","Type":"ContainerStarted","Data":"1319bf3643c9d067077992d616a685fb20acc1974840c5d9883a809deb332689"} Apr 20 14:56:34.362964 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:34.362758 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww" event={"ID":"fe2df217-e552-4eed-993d-b467cccf24b4","Type":"ContainerStarted","Data":"1465526831de08be8de54cecdcd3f514fdf368c70c003086d32c419a7798e42a"} Apr 20 14:56:35.367437 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:35.367369 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk" event={"ID":"c974df5b-afcc-4232-9913-36d4d36cd14b","Type":"ContainerStarted","Data":"62b904f968bec42eabd5fe57d39989670f553b149582d2185773a6a76665ce28"} Apr 20 14:56:35.369256 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:35.369232 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n6qq5" event={"ID":"7cde96ba-1beb-4dd0-91b3-6c4339468969","Type":"ContainerStarted","Data":"bf5ec9d521e1c26243c698e9fab7e27fb31cd9cb1ab88610e188722f82fed824"} Apr 20 14:56:35.369256 ip-10-0-129-82 
kubenswrapper[2575]: I0420 14:56:35.369257 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n6qq5" event={"ID":"7cde96ba-1beb-4dd0-91b3-6c4339468969","Type":"ContainerStarted","Data":"46386aac5e4090b32a4274a41c9b8fcb3752035df15a2e4044524c9952f38183"} Apr 20 14:56:35.370916 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:35.370892 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww" event={"ID":"fe2df217-e552-4eed-993d-b467cccf24b4","Type":"ContainerStarted","Data":"ef7119729ba4797309fa7bb65f049d803324677f740f91e87688689f0ec17823"} Apr 20 14:56:35.386475 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:35.386437 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-mk8bk" podStartSLOduration=3.117984108 podStartE2EDuration="4.386426025s" podCreationTimestamp="2026-04-20 14:56:31 +0000 UTC" firstStartedPulling="2026-04-20 14:56:32.919425446 +0000 UTC m=+184.650743711" lastFinishedPulling="2026-04-20 14:56:34.187867359 +0000 UTC m=+185.919185628" observedRunningTime="2026-04-20 14:56:35.384739821 +0000 UTC m=+187.116058108" watchObservedRunningTime="2026-04-20 14:56:35.386426025 +0000 UTC m=+187.117744312" Apr 20 14:56:35.400507 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:35.400468 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-n6qq5" podStartSLOduration=3.563857617 podStartE2EDuration="4.400455052s" podCreationTimestamp="2026-04-20 14:56:31 +0000 UTC" firstStartedPulling="2026-04-20 14:56:32.789039306 +0000 UTC m=+184.520357585" lastFinishedPulling="2026-04-20 14:56:33.625636749 +0000 UTC m=+185.356955020" observedRunningTime="2026-04-20 14:56:35.399674564 +0000 UTC m=+187.130992857" watchObservedRunningTime="2026-04-20 14:56:35.400455052 +0000 UTC m=+187.131773331" Apr 20 14:56:35.415117 ip-10-0-129-82 
kubenswrapper[2575]: I0420 14:56:35.415076 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cq7ww" podStartSLOduration=3.344155384 podStartE2EDuration="4.415064396s" podCreationTimestamp="2026-04-20 14:56:31 +0000 UTC" firstStartedPulling="2026-04-20 14:56:34.191528865 +0000 UTC m=+185.922847132" lastFinishedPulling="2026-04-20 14:56:35.262437866 +0000 UTC m=+186.993756144" observedRunningTime="2026-04-20 14:56:35.414413468 +0000 UTC m=+187.145731757" watchObservedRunningTime="2026-04-20 14:56:35.415064396 +0000 UTC m=+187.146382682" Apr 20 14:56:36.241352 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.241323 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-57b4bd5cff-588gf"] Apr 20 14:56:36.244558 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.244537 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.247804 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.247778 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 14:56:36.247804 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.247800 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 20 14:56:36.248016 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.247814 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-4ptrz\"" Apr 20 14:56:36.248016 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.247803 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 20 14:56:36.248016 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.247855 2575 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-datpvn6jj0c97\"" Apr 20 14:56:36.248016 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.247784 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 20 14:56:36.252018 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.252000 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-57b4bd5cff-588gf"] Apr 20 14:56:36.333328 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.333292 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe9129b1-1735-446d-a04a-9eceeb28fad7-client-ca-bundle\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.333485 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.333351 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/fe9129b1-1735-446d-a04a-9eceeb28fad7-audit-log\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.333485 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.333446 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt6ft\" (UniqueName: \"kubernetes.io/projected/fe9129b1-1735-446d-a04a-9eceeb28fad7-kube-api-access-jt6ft\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.333603 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.333481 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/fe9129b1-1735-446d-a04a-9eceeb28fad7-metrics-server-audit-profiles\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.333603 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.333562 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/fe9129b1-1735-446d-a04a-9eceeb28fad7-secret-metrics-server-tls\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.333603 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.333592 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/fe9129b1-1735-446d-a04a-9eceeb28fad7-secret-metrics-server-client-certs\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.333697 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.333643 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe9129b1-1735-446d-a04a-9eceeb28fad7-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.434644 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.434589 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/fe9129b1-1735-446d-a04a-9eceeb28fad7-secret-metrics-server-tls\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.435127 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.434666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/fe9129b1-1735-446d-a04a-9eceeb28fad7-secret-metrics-server-client-certs\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.435127 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.434750 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe9129b1-1735-446d-a04a-9eceeb28fad7-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.435127 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.434807 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe9129b1-1735-446d-a04a-9eceeb28fad7-client-ca-bundle\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.435127 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.434965 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/fe9129b1-1735-446d-a04a-9eceeb28fad7-audit-log\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " 
pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.435127 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.435084 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jt6ft\" (UniqueName: \"kubernetes.io/projected/fe9129b1-1735-446d-a04a-9eceeb28fad7-kube-api-access-jt6ft\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.435127 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.435115 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/fe9129b1-1735-446d-a04a-9eceeb28fad7-metrics-server-audit-profiles\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.435549 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.435503 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/fe9129b1-1735-446d-a04a-9eceeb28fad7-audit-log\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.435761 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.435710 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe9129b1-1735-446d-a04a-9eceeb28fad7-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.436057 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.436036 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/fe9129b1-1735-446d-a04a-9eceeb28fad7-metrics-server-audit-profiles\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.437367 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.437344 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/fe9129b1-1735-446d-a04a-9eceeb28fad7-secret-metrics-server-client-certs\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.437575 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.437555 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe9129b1-1735-446d-a04a-9eceeb28fad7-client-ca-bundle\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.437627 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.437570 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/fe9129b1-1735-446d-a04a-9eceeb28fad7-secret-metrics-server-tls\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.442814 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.442794 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt6ft\" (UniqueName: \"kubernetes.io/projected/fe9129b1-1735-446d-a04a-9eceeb28fad7-kube-api-access-jt6ft\") pod \"metrics-server-57b4bd5cff-588gf\" (UID: \"fe9129b1-1735-446d-a04a-9eceeb28fad7\") " 
pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.554975 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.554884 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" Apr 20 14:56:36.679025 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:36.678993 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-57b4bd5cff-588gf"] Apr 20 14:56:36.682160 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:56:36.682131 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe9129b1_1735_446d_a04a_9eceeb28fad7.slice/crio-4b47e189f809e310f08abc549330b91600ee357ba31fb67c1521ea37c66ea664 WatchSource:0}: Error finding container 4b47e189f809e310f08abc549330b91600ee357ba31fb67c1521ea37c66ea664: Status 404 returned error can't find the container with id 4b47e189f809e310f08abc549330b91600ee357ba31fb67c1521ea37c66ea664 Apr 20 14:56:37.377799 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:37.377763 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" event={"ID":"fe9129b1-1735-446d-a04a-9eceeb28fad7","Type":"ContainerStarted","Data":"4b47e189f809e310f08abc549330b91600ee357ba31fb67c1521ea37c66ea664"} Apr 20 14:56:37.861943 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:37.861910 2575 scope.go:117] "RemoveContainer" containerID="b7c36b2583a8df9b07b0f7df60ae1c7eb0c6950238acdc900687a6955ca17ebf" Apr 20 14:56:37.862349 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:37.862136 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-b8vvq_openshift-console-operator(a3b8c0ca-6a14-4aa3-b779-8722694554e7)\"" 
pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" podUID="a3b8c0ca-6a14-4aa3-b779-8722694554e7" Apr 20 14:56:38.016405 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.016368 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 14:56:38.022121 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.022091 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.024471 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.024440 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 14:56:38.024471 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.024451 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 14:56:38.024744 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.024727 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 14:56:38.024838 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.024753 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 14:56:38.024838 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.024731 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 14:56:38.026112 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.025502 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8gc4sujm77lvt\"" Apr 20 14:56:38.026112 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.025502 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 14:56:38.026112 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.025775 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 14:56:38.026112 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.025847 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-j5xw6\"" Apr 20 14:56:38.026112 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.025916 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 14:56:38.026112 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.025952 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 14:56:38.026112 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.026091 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 14:56:38.026469 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.026171 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 14:56:38.028254 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.028188 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 14:56:38.033593 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.033571 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 14:56:38.053076 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.053035 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.053243 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.053127 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.053243 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.053156 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctqng\" (UniqueName: \"kubernetes.io/projected/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-kube-api-access-ctqng\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.053243 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.053200 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.053243 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.053240 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 
14:56:38.053463 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.053271 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.053463 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.053308 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.053463 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.053336 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.053463 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.053366 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.053463 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.053392 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-config\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.053463 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.053439 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-config-out\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.053759 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.053482 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.053759 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.053554 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.053759 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.053591 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.053759 ip-10-0-129-82 kubenswrapper[2575]: I0420 
14:56:38.053640 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.053759 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.053715 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-web-config\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.053759 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.053748 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.054009 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.053807 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.154751 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.154725 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.154868 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.154757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.154868 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.154776 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.154868 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.154834 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-config\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.155024 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.154873 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-config-out\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.155024 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.154900 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.155024 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.154933 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.155024 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.154968 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.155024 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.154997 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.155254 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.155043 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-web-config\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.155254 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.155071 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.155254 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.155125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.155254 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.155160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.155254 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.155225 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.157468 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.155929 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.157468 ip-10-0-129-82 
kubenswrapper[2575]: I0420 14:56:38.156113 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.157468 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.157088 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.157468 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.157420 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.157746 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.157553 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctqng\" (UniqueName: \"kubernetes.io/projected/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-kube-api-access-ctqng\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.157746 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.157596 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.157746 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.157626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.157746 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.157663 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.157974 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.157912 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.158838 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.158813 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.159084 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.159060 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.159378 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.159336 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.159466 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.159412 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.159588 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.159545 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.160006 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.159968 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.160288 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.160265 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.160383 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.160336 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.160654 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.160627 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.160735 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.160652 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-config\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.161808 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.161784 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-web-config\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.161913 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.161891 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.164445 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.164425 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctqng\" (UniqueName: \"kubernetes.io/projected/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-kube-api-access-ctqng\") pod \"prometheus-k8s-0\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.335134 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.335087 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:38.385215 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.383376 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" event={"ID":"fe9129b1-1735-446d-a04a-9eceeb28fad7","Type":"ContainerStarted","Data":"3ae0a85883583c1c6aaa7a11e7a2e4276bd342cd7d9190a91b05921dacc6a2fe"} Apr 20 14:56:38.400971 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.400428 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf" podStartSLOduration=0.952862074 podStartE2EDuration="2.400408338s" podCreationTimestamp="2026-04-20 14:56:36 +0000 UTC" firstStartedPulling="2026-04-20 14:56:36.68401374 +0000 UTC m=+188.415332006" lastFinishedPulling="2026-04-20 14:56:38.131559993 +0000 UTC m=+189.862878270" observedRunningTime="2026-04-20 14:56:38.399079417 +0000 UTC m=+190.130397704" watchObservedRunningTime="2026-04-20 14:56:38.400408338 +0000 UTC m=+190.131726637" Apr 20 14:56:38.488613 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:38.488578 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 
14:56:38.491698 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:56:38.491673 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d571f54_ebd6_43c9_b57e_bcf21a6a8668.slice/crio-f9bbdcd45a9536cdde58135d8d05e4e988ecc6724d233cc9acb1840edaee15c8 WatchSource:0}: Error finding container f9bbdcd45a9536cdde58135d8d05e4e988ecc6724d233cc9acb1840edaee15c8: Status 404 returned error can't find the container with id f9bbdcd45a9536cdde58135d8d05e4e988ecc6724d233cc9acb1840edaee15c8 Apr 20 14:56:39.388216 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:39.388148 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d571f54-ebd6-43c9-b57e-bcf21a6a8668","Type":"ContainerStarted","Data":"f9bbdcd45a9536cdde58135d8d05e4e988ecc6724d233cc9acb1840edaee15c8"} Apr 20 14:56:40.392107 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:40.392072 2575 generic.go:358] "Generic (PLEG): container finished" podID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerID="708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c" exitCode=0 Apr 20 14:56:40.392501 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:40.392165 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d571f54-ebd6-43c9-b57e-bcf21a6a8668","Type":"ContainerDied","Data":"708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c"} Apr 20 14:56:43.403575 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:43.403475 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d571f54-ebd6-43c9-b57e-bcf21a6a8668","Type":"ContainerStarted","Data":"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b"} Apr 20 14:56:43.403575 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:43.403525 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"4d571f54-ebd6-43c9-b57e-bcf21a6a8668","Type":"ContainerStarted","Data":"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e"} Apr 20 14:56:45.413239 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:45.413194 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d571f54-ebd6-43c9-b57e-bcf21a6a8668","Type":"ContainerStarted","Data":"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae"} Apr 20 14:56:45.413239 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:45.413236 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d571f54-ebd6-43c9-b57e-bcf21a6a8668","Type":"ContainerStarted","Data":"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64"} Apr 20 14:56:45.413668 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:45.413249 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d571f54-ebd6-43c9-b57e-bcf21a6a8668","Type":"ContainerStarted","Data":"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93"} Apr 20 14:56:45.413668 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:45.413261 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d571f54-ebd6-43c9-b57e-bcf21a6a8668","Type":"ContainerStarted","Data":"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d"} Apr 20 14:56:45.447293 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:45.447238 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.055864839 podStartE2EDuration="8.447224569s" podCreationTimestamp="2026-04-20 14:56:37 +0000 UTC" firstStartedPulling="2026-04-20 14:56:38.493977568 +0000 UTC m=+190.225295837" lastFinishedPulling="2026-04-20 14:56:44.885337301 +0000 UTC m=+196.616655567" observedRunningTime="2026-04-20 
14:56:45.445491345 +0000 UTC m=+197.176809650" watchObservedRunningTime="2026-04-20 14:56:45.447224569 +0000 UTC m=+197.178542856" Apr 20 14:56:47.341781 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.341742 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5d4b464978-2whf9"] Apr 20 14:56:47.342133 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:56:47.342012 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-5d4b464978-2whf9" podUID="89ddc55e-4262-44ca-b737-16453dbd75de" Apr 20 14:56:47.418419 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.418389 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d4b464978-2whf9" Apr 20 14:56:47.422706 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.422687 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5d4b464978-2whf9" Apr 20 14:56:47.551508 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.551459 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/89ddc55e-4262-44ca-b737-16453dbd75de-registry-certificates\") pod \"89ddc55e-4262-44ca-b737-16453dbd75de\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " Apr 20 14:56:47.551508 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.551529 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-bound-sa-token\") pod \"89ddc55e-4262-44ca-b737-16453dbd75de\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " Apr 20 14:56:47.551742 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.551602 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/89ddc55e-4262-44ca-b737-16453dbd75de-installation-pull-secrets\") pod \"89ddc55e-4262-44ca-b737-16453dbd75de\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " Apr 20 14:56:47.551742 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.551628 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/89ddc55e-4262-44ca-b737-16453dbd75de-ca-trust-extracted\") pod \"89ddc55e-4262-44ca-b737-16453dbd75de\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " Apr 20 14:56:47.551742 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.551657 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmgdl\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-kube-api-access-nmgdl\") pod \"89ddc55e-4262-44ca-b737-16453dbd75de\" (UID: 
\"89ddc55e-4262-44ca-b737-16453dbd75de\") " Apr 20 14:56:47.551742 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.551688 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89ddc55e-4262-44ca-b737-16453dbd75de-trusted-ca\") pod \"89ddc55e-4262-44ca-b737-16453dbd75de\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " Apr 20 14:56:47.551742 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.551732 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/89ddc55e-4262-44ca-b737-16453dbd75de-image-registry-private-configuration\") pod \"89ddc55e-4262-44ca-b737-16453dbd75de\" (UID: \"89ddc55e-4262-44ca-b737-16453dbd75de\") " Apr 20 14:56:47.551974 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.551853 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89ddc55e-4262-44ca-b737-16453dbd75de-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "89ddc55e-4262-44ca-b737-16453dbd75de" (UID: "89ddc55e-4262-44ca-b737-16453dbd75de"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:56:47.552025 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.551976 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89ddc55e-4262-44ca-b737-16453dbd75de-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "89ddc55e-4262-44ca-b737-16453dbd75de" (UID: "89ddc55e-4262-44ca-b737-16453dbd75de"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:56:47.552139 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.552114 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89ddc55e-4262-44ca-b737-16453dbd75de-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "89ddc55e-4262-44ca-b737-16453dbd75de" (UID: "89ddc55e-4262-44ca-b737-16453dbd75de"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:56:47.552275 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.552146 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/89ddc55e-4262-44ca-b737-16453dbd75de-ca-trust-extracted\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:56:47.552388 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.552293 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/89ddc55e-4262-44ca-b737-16453dbd75de-registry-certificates\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:56:47.554007 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.553981 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89ddc55e-4262-44ca-b737-16453dbd75de-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "89ddc55e-4262-44ca-b737-16453dbd75de" (UID: "89ddc55e-4262-44ca-b737-16453dbd75de"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:56:47.554105 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.553988 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-kube-api-access-nmgdl" (OuterVolumeSpecName: "kube-api-access-nmgdl") pod "89ddc55e-4262-44ca-b737-16453dbd75de" (UID: "89ddc55e-4262-44ca-b737-16453dbd75de"). InnerVolumeSpecName "kube-api-access-nmgdl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:56:47.554105 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.554069 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "89ddc55e-4262-44ca-b737-16453dbd75de" (UID: "89ddc55e-4262-44ca-b737-16453dbd75de"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:56:47.554105 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.554085 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89ddc55e-4262-44ca-b737-16453dbd75de-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "89ddc55e-4262-44ca-b737-16453dbd75de" (UID: "89ddc55e-4262-44ca-b737-16453dbd75de"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:56:47.653783 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.653690 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/89ddc55e-4262-44ca-b737-16453dbd75de-image-registry-private-configuration\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:56:47.653783 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.653722 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-bound-sa-token\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:56:47.653783 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.653741 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/89ddc55e-4262-44ca-b737-16453dbd75de-installation-pull-secrets\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:56:47.653783 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.653751 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nmgdl\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-kube-api-access-nmgdl\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:56:47.653783 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:47.653760 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89ddc55e-4262-44ca-b737-16453dbd75de-trusted-ca\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:56:48.335997 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:48.335960 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:48.421650 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:48.421617 2575 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-image-registry/image-registry-5d4b464978-2whf9"
Apr 20 14:56:48.452289 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:48.452250 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5d4b464978-2whf9"]
Apr 20 14:56:48.455599 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:48.455569 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5d4b464978-2whf9"]
Apr 20 14:56:48.563341 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:48.563300 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/89ddc55e-4262-44ca-b737-16453dbd75de-registry-tls\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\""
Apr 20 14:56:48.862239 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:48.862213 2575 scope.go:117] "RemoveContainer" containerID="b7c36b2583a8df9b07b0f7df60ae1c7eb0c6950238acdc900687a6955ca17ebf"
Apr 20 14:56:48.864187 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:48.864166 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89ddc55e-4262-44ca-b737-16453dbd75de" path="/var/lib/kubelet/pods/89ddc55e-4262-44ca-b737-16453dbd75de/volumes"
Apr 20 14:56:49.426972 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:49.426943 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/2.log"
Apr 20 14:56:49.427376 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:49.427018 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" event={"ID":"a3b8c0ca-6a14-4aa3-b779-8722694554e7","Type":"ContainerStarted","Data":"65df783466aac0f84ab525f890f6e6ec7b7bcf0e6008f804dfb40bea4d14ae12"}
Apr 20 14:56:49.427376 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:49.427299 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq"
Apr 20 14:56:49.446142 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:49.446090 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq" podStartSLOduration=52.040978174 podStartE2EDuration="54.446073216s" podCreationTimestamp="2026-04-20 14:55:55 +0000 UTC" firstStartedPulling="2026-04-20 14:55:56.418742096 +0000 UTC m=+148.150060377" lastFinishedPulling="2026-04-20 14:55:58.82383715 +0000 UTC m=+150.555155419" observedRunningTime="2026-04-20 14:56:49.444791281 +0000 UTC m=+201.176109566" watchObservedRunningTime="2026-04-20 14:56:49.446073216 +0000 UTC m=+201.177391527"
Apr 20 14:56:49.592195 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:49.592163 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-b8vvq"
Apr 20 14:56:56.555691 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:56.555655 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf"
Apr 20 14:56:56.555691 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:56:56.555699 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf"
Apr 20 14:57:07.478084 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:07.478049 2575 generic.go:358] "Generic (PLEG): container finished" podID="fce9aead-ae79-449d-9e77-55a7a14471b5" containerID="27d325333982c7eca8eeac3b203db6a2f14bbb740c559feca90691abcea834a3" exitCode=0
Apr 20 14:57:07.478453 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:07.478124 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8l5tw" event={"ID":"fce9aead-ae79-449d-9e77-55a7a14471b5","Type":"ContainerDied","Data":"27d325333982c7eca8eeac3b203db6a2f14bbb740c559feca90691abcea834a3"}
Apr 20 14:57:07.478496 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:07.478462 2575 scope.go:117] "RemoveContainer" containerID="27d325333982c7eca8eeac3b203db6a2f14bbb740c559feca90691abcea834a3"
Apr 20 14:57:08.482636 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:08.482605 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8l5tw" event={"ID":"fce9aead-ae79-449d-9e77-55a7a14471b5","Type":"ContainerStarted","Data":"6abc9889373fbdeb6802df179cad25be716ca6dd6ef0207fb1c2a12191951a91"}
Apr 20 14:57:16.560738 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:16.560705 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf"
Apr 20 14:57:16.564576 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:16.564553 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-57b4bd5cff-588gf"
Apr 20 14:57:32.556813 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:32.556778 2575 generic.go:358] "Generic (PLEG): container finished" podID="f5a93abc-e707-4adb-9942-2ed22b758d32" containerID="348d5004028c8794adaafb23528856aaa08e30851ae9ad2d92c3b1093350f5ab" exitCode=0
Apr 20 14:57:32.557330 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:32.556834 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9qx5x" event={"ID":"f5a93abc-e707-4adb-9942-2ed22b758d32","Type":"ContainerDied","Data":"348d5004028c8794adaafb23528856aaa08e30851ae9ad2d92c3b1093350f5ab"}
Apr 20 14:57:32.557330 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:32.557186 2575 scope.go:117] "RemoveContainer" containerID="348d5004028c8794adaafb23528856aaa08e30851ae9ad2d92c3b1093350f5ab"
Apr 20 14:57:33.561939 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:33.561903 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9qx5x" event={"ID":"f5a93abc-e707-4adb-9942-2ed22b758d32","Type":"ContainerStarted","Data":"17bf0367ceaa7b58117dc3aa9c2780803ffd6856643f37d775730d0429772f79"}
Apr 20 14:57:38.335830 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:38.335797 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:57:38.354972 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:38.354946 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:57:38.591559 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:38.591469 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:57:40.720917 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:40.720881 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs\") pod \"network-metrics-daemon-vbpm4\" (UID: \"aaf83337-5403-4bd0-b782-5d5fa014368f\") " pod="openshift-multus/network-metrics-daemon-vbpm4"
Apr 20 14:57:40.723195 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:40.723174 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaf83337-5403-4bd0-b782-5d5fa014368f-metrics-certs\") pod \"network-metrics-daemon-vbpm4\" (UID: \"aaf83337-5403-4bd0-b782-5d5fa014368f\") " pod="openshift-multus/network-metrics-daemon-vbpm4"
Apr 20 14:57:40.964164 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:40.964128 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ns2f2\""
Apr 20 14:57:40.972470 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:40.972422 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vbpm4"
Apr 20 14:57:41.103952 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:41.103920 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vbpm4"]
Apr 20 14:57:41.108502 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:57:41.108474 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaf83337_5403_4bd0_b782_5d5fa014368f.slice/crio-3c868197a666eca657e871d5d5f07b177e7c9271c702cde9f4d82dcc4727122b WatchSource:0}: Error finding container 3c868197a666eca657e871d5d5f07b177e7c9271c702cde9f4d82dcc4727122b: Status 404 returned error can't find the container with id 3c868197a666eca657e871d5d5f07b177e7c9271c702cde9f4d82dcc4727122b
Apr 20 14:57:41.587676 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:41.587633 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vbpm4" event={"ID":"aaf83337-5403-4bd0-b782-5d5fa014368f","Type":"ContainerStarted","Data":"3c868197a666eca657e871d5d5f07b177e7c9271c702cde9f4d82dcc4727122b"}
Apr 20 14:57:42.591768 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:42.591732 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vbpm4" event={"ID":"aaf83337-5403-4bd0-b782-5d5fa014368f","Type":"ContainerStarted","Data":"49ba395ff494bb0d92bddf1170246a92299381a2f2679d16e3df78d24d67eb51"}
Apr 20 14:57:42.591768 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:42.591771 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vbpm4" event={"ID":"aaf83337-5403-4bd0-b782-5d5fa014368f","Type":"ContainerStarted","Data":"a5df032d78eea03f93bb0755235e381175e662159b62a728b07f6d880df3496f"}
Apr 20 14:57:42.607631 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:42.607583 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vbpm4" podStartSLOduration=252.576663172 podStartE2EDuration="4m13.607569544s" podCreationTimestamp="2026-04-20 14:53:29 +0000 UTC" firstStartedPulling="2026-04-20 14:57:41.110267867 +0000 UTC m=+252.841586134" lastFinishedPulling="2026-04-20 14:57:42.141174241 +0000 UTC m=+253.872492506" observedRunningTime="2026-04-20 14:57:42.606654696 +0000 UTC m=+254.337972996" watchObservedRunningTime="2026-04-20 14:57:42.607569544 +0000 UTC m=+254.338887844"
Apr 20 14:57:56.356926 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.356829 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 14:57:56.357724 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.357488 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="kube-rbac-proxy-web" containerID="cri-o://b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93" gracePeriod=600
Apr 20 14:57:56.357724 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.357545 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="thanos-sidecar" containerID="cri-o://ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d" gracePeriod=600
Apr 20 14:57:56.357724 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.357474 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="prometheus" containerID="cri-o://73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e" gracePeriod=600
Apr 20 14:57:56.357724 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.357488 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="kube-rbac-proxy-thanos" containerID="cri-o://14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae" gracePeriod=600
Apr 20 14:57:56.357724 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.357539 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="config-reloader" containerID="cri-o://d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b" gracePeriod=600
Apr 20 14:57:56.357724 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.357639 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="kube-rbac-proxy" containerID="cri-o://23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64" gracePeriod=600
Apr 20 14:57:56.600144 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.600120 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:57:56.641334 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.641239 2575 generic.go:358] "Generic (PLEG): container finished" podID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerID="14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae" exitCode=0
Apr 20 14:57:56.641334 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.641269 2575 generic.go:358] "Generic (PLEG): container finished" podID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerID="23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64" exitCode=0
Apr 20 14:57:56.641334 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.641279 2575 generic.go:358] "Generic (PLEG): container finished" podID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerID="b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93" exitCode=0
Apr 20 14:57:56.641334 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.641287 2575 generic.go:358] "Generic (PLEG): container finished" podID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerID="ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d" exitCode=0
Apr 20 14:57:56.641334 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.641296 2575 generic.go:358] "Generic (PLEG): container finished" podID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerID="d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b" exitCode=0
Apr 20 14:57:56.641334 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.641304 2575 generic.go:358] "Generic (PLEG): container finished" podID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerID="73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e" exitCode=0
Apr 20 14:57:56.641813 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.641338 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d571f54-ebd6-43c9-b57e-bcf21a6a8668","Type":"ContainerDied","Data":"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae"}
Apr 20 14:57:56.641813 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.641385 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d571f54-ebd6-43c9-b57e-bcf21a6a8668","Type":"ContainerDied","Data":"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64"}
Apr 20 14:57:56.641813 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.641396 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d571f54-ebd6-43c9-b57e-bcf21a6a8668","Type":"ContainerDied","Data":"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93"}
Apr 20 14:57:56.641813 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.641407 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d571f54-ebd6-43c9-b57e-bcf21a6a8668","Type":"ContainerDied","Data":"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d"}
Apr 20 14:57:56.641813 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.641410 2575 scope.go:117] "RemoveContainer" containerID="14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae"
Apr 20 14:57:56.641813 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.641418 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d571f54-ebd6-43c9-b57e-bcf21a6a8668","Type":"ContainerDied","Data":"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b"}
Apr 20 14:57:56.641813 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.641428 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d571f54-ebd6-43c9-b57e-bcf21a6a8668","Type":"ContainerDied","Data":"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e"}
Apr 20 14:57:56.641813 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.641437 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4d571f54-ebd6-43c9-b57e-bcf21a6a8668","Type":"ContainerDied","Data":"f9bbdcd45a9536cdde58135d8d05e4e988ecc6724d233cc9acb1840edaee15c8"}
Apr 20 14:57:56.641813 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.641396 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:57:56.651831 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.651799 2575 scope.go:117] "RemoveContainer" containerID="23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64"
Apr 20 14:57:56.661879 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.661771 2575 scope.go:117] "RemoveContainer" containerID="b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93"
Apr 20 14:57:56.671416 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.671397 2575 scope.go:117] "RemoveContainer" containerID="ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d"
Apr 20 14:57:56.678250 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.678229 2575 scope.go:117] "RemoveContainer" containerID="d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b"
Apr 20 14:57:56.684678 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.684661 2575 scope.go:117] "RemoveContainer" containerID="73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e"
Apr 20 14:57:56.691887 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.691867 2575 scope.go:117] "RemoveContainer" containerID="708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c"
Apr 20 14:57:56.698307 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.698288 2575 scope.go:117] "RemoveContainer" containerID="14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae"
Apr 20 14:57:56.698601 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:57:56.698581 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae\": container with ID starting with 14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae not found: ID does not exist" containerID="14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae"
Apr 20 14:57:56.698673 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.698615 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae"} err="failed to get container status \"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae\": rpc error: code = NotFound desc = could not find container \"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae\": container with ID starting with 14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae not found: ID does not exist"
Apr 20 14:57:56.698673 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.698663 2575 scope.go:117] "RemoveContainer" containerID="23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64"
Apr 20 14:57:56.698898 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:57:56.698882 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64\": container with ID starting with 23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64 not found: ID does not exist" containerID="23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64"
Apr 20 14:57:56.698948 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.698906 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64"} err="failed to get container status \"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64\": rpc error: code = NotFound desc = could not find container \"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64\": container with ID starting with 23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64 not found: ID does not exist"
Apr 20 14:57:56.698948 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.698924 2575 scope.go:117] "RemoveContainer" containerID="b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93"
Apr 20 14:57:56.699122 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:57:56.699107 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93\": container with ID starting with b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93 not found: ID does not exist" containerID="b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93"
Apr 20 14:57:56.699164 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.699125 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93"} err="failed to get container status \"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93\": rpc error: code = NotFound desc = could not find container \"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93\": container with ID starting with b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93 not found: ID does not exist"
Apr 20 14:57:56.699164 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.699139 2575 scope.go:117] "RemoveContainer" containerID="ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d"
Apr 20 14:57:56.699335 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:57:56.699316 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d\": container with ID starting with ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d not found: ID does not exist" containerID="ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d"
Apr 20 14:57:56.699398 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.699344 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d"} err="failed to get container status \"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d\": rpc error: code = NotFound desc = could not find container \"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d\": container with ID starting with ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d not found: ID does not exist"
Apr 20 14:57:56.699398 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.699365 2575 scope.go:117] "RemoveContainer" containerID="d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b"
Apr 20 14:57:56.699604 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:57:56.699584 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b\": container with ID starting with d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b not found: ID does not exist" containerID="d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b"
Apr 20 14:57:56.699675 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.699612 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b"} err="failed to get container status \"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b\": rpc error: code = NotFound desc = could not find container \"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b\": container with ID starting with d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b not found: ID does not exist"
Apr 20 14:57:56.699675 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.699630 2575 scope.go:117] "RemoveContainer" containerID="73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e"
Apr 20 14:57:56.699897 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:57:56.699874 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e\": container with ID starting with 73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e not found: ID does not exist" containerID="73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e"
Apr 20 14:57:56.699951 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.699906 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e"} err="failed to get container status \"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e\": rpc error: code = NotFound desc = could not find container \"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e\": container with ID starting with 73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e not found: ID does not exist"
Apr 20 14:57:56.699951 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.699921 2575 scope.go:117] "RemoveContainer" containerID="708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c"
Apr 20 14:57:56.700137 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:57:56.700121 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c\": container with ID starting with 708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c not found: ID does not exist" containerID="708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c"
Apr 20 14:57:56.700176 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.700141 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c"} err="failed to get container status \"708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c\": rpc error: code = NotFound desc = could not find container \"708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c\": container with ID starting with 708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c not found: ID does not exist"
Apr 20 14:57:56.700176 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.700163 2575 scope.go:117] "RemoveContainer" containerID="14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae"
Apr 20 14:57:56.700412 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.700392 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae"} err="failed to get container status \"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae\": rpc error: code = NotFound desc = could not find container \"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae\": container with ID starting with 14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae not found: ID does not exist"
Apr 20 14:57:56.700464 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.700412 2575 scope.go:117] "RemoveContainer" containerID="23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64"
Apr 20 14:57:56.700644 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.700624 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64"} err="failed to get container status \"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64\": rpc error: code = NotFound desc = could not find container \"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64\": container with ID starting with 23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64 not found: ID does not exist"
Apr 20 14:57:56.700718 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.700646 2575 scope.go:117] "RemoveContainer" containerID="b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93"
Apr 20 14:57:56.700865 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.700847 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93"} err="failed to get container status \"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93\": rpc error: code = NotFound desc = could not find container \"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93\": container with ID starting with b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93 not found: ID does not exist"
Apr 20 14:57:56.700914 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.700866 2575 scope.go:117] "RemoveContainer" containerID="ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d"
Apr 20 14:57:56.701091 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.701074 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d"} err="failed to get container status \"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d\": rpc error: code = NotFound desc = could not find container \"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d\": container with ID starting with ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d not found: ID does not exist"
Apr 20 14:57:56.701091 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.701089 2575 scope.go:117] "RemoveContainer" containerID="d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b"
Apr 20 14:57:56.701293 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.701276 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b"} err="failed to get container status \"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b\": rpc error: code = NotFound desc = could not find container \"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b\": container with ID starting with d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b not found: ID does not exist"
Apr 20 14:57:56.701293 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.701292 2575 scope.go:117] "RemoveContainer" containerID="73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e"
Apr 20 14:57:56.701462 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.701447 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e"} err="failed to get container status \"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e\": rpc error: code = NotFound desc = could not find container \"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e\": container with ID starting with 73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e not found: ID does not exist"
Apr 20 14:57:56.701507 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.701462 2575 scope.go:117] "RemoveContainer" containerID="708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c"
Apr 20 14:57:56.701677 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.701662 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c"} err="failed to get container status \"708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c\": rpc error: code = NotFound desc = could not find container \"708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c\": container with ID starting with 708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c not found: ID does not exist"
Apr 20 14:57:56.701715 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.701677 2575 scope.go:117] "RemoveContainer" containerID="14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae"
Apr 20 14:57:56.701861 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.701845 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae"} err="failed to get container status \"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae\": rpc error: code = NotFound desc = could not find container \"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae\": container with ID starting with 14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae not found: ID does not exist"
Apr 20 14:57:56.701906 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.701860 2575 scope.go:117] "RemoveContainer" containerID="23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64"
Apr 20 14:57:56.702027 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.702012 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64"} err="failed to get container status \"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64\": rpc error: code = NotFound desc = could not find container \"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64\": container with ID starting with 23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64 not found: ID does not exist"
Apr 20 14:57:56.702027 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.702027 2575 scope.go:117] "RemoveContainer" containerID="b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93"
Apr 20 14:57:56.702196 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.702176 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93"} err="failed to get container status \"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93\": rpc error: code = NotFound desc = could not find container \"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93\": container with ID starting with b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93 not found: ID does not exist"
Apr 20 14:57:56.702236 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.702199 2575 scope.go:117] "RemoveContainer" containerID="ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d"
Apr 20 14:57:56.702397 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.702380 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d"} err="failed to get container status \"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d\": rpc error: code = NotFound desc = could not find container \"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d\": container with ID starting with ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d not found: ID does not exist"
Apr 20 14:57:56.702449 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.702397 2575 scope.go:117] "RemoveContainer" containerID="d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b"
Apr 20 14:57:56.702669 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.702649 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b"} err="failed to get container status \"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b\": rpc error: code = NotFound desc = could not find container \"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b\": container with ID starting with d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b not found: ID does not exist"
Apr 20 14:57:56.702669 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.702669 2575 scope.go:117] "RemoveContainer" containerID="73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e"
Apr 20 14:57:56.702901 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.702883 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e"} err="failed to get container status \"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e\": rpc error: code = NotFound desc = could not find container \"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e\": container with ID starting with 73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e not found: ID does not exist"
Apr 20 14:57:56.702951 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.702902 2575 scope.go:117] "RemoveContainer" containerID="708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c"
Apr 20 14:57:56.703087 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.703068 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c"} err="failed to get container status
\"708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c\": rpc error: code = NotFound desc = could not find container \"708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c\": container with ID starting with 708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c not found: ID does not exist" Apr 20 14:57:56.703157 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.703089 2575 scope.go:117] "RemoveContainer" containerID="14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae" Apr 20 14:57:56.703320 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.703304 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae"} err="failed to get container status \"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae\": rpc error: code = NotFound desc = could not find container \"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae\": container with ID starting with 14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae not found: ID does not exist" Apr 20 14:57:56.703371 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.703319 2575 scope.go:117] "RemoveContainer" containerID="23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64" Apr 20 14:57:56.703532 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.703497 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64"} err="failed to get container status \"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64\": rpc error: code = NotFound desc = could not find container \"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64\": container with ID starting with 23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64 not found: ID does not exist" Apr 20 14:57:56.703629 ip-10-0-129-82 
kubenswrapper[2575]: I0420 14:57:56.703617 2575 scope.go:117] "RemoveContainer" containerID="b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93" Apr 20 14:57:56.703835 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.703817 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93"} err="failed to get container status \"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93\": rpc error: code = NotFound desc = could not find container \"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93\": container with ID starting with b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93 not found: ID does not exist" Apr 20 14:57:56.703909 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.703836 2575 scope.go:117] "RemoveContainer" containerID="ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d" Apr 20 14:57:56.704053 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.704036 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d"} err="failed to get container status \"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d\": rpc error: code = NotFound desc = could not find container \"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d\": container with ID starting with ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d not found: ID does not exist" Apr 20 14:57:56.704103 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.704055 2575 scope.go:117] "RemoveContainer" containerID="d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b" Apr 20 14:57:56.704245 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.704231 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b"} err="failed to get container status \"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b\": rpc error: code = NotFound desc = could not find container \"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b\": container with ID starting with d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b not found: ID does not exist" Apr 20 14:57:56.704288 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.704244 2575 scope.go:117] "RemoveContainer" containerID="73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e" Apr 20 14:57:56.704442 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.704413 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e"} err="failed to get container status \"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e\": rpc error: code = NotFound desc = could not find container \"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e\": container with ID starting with 73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e not found: ID does not exist" Apr 20 14:57:56.704442 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.704433 2575 scope.go:117] "RemoveContainer" containerID="708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c" Apr 20 14:57:56.704650 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.704617 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c"} err="failed to get container status \"708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c\": rpc error: code = NotFound desc = could not find container \"708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c\": container with ID starting with 
708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c not found: ID does not exist" Apr 20 14:57:56.704650 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.704635 2575 scope.go:117] "RemoveContainer" containerID="14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae" Apr 20 14:57:56.704842 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.704822 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae"} err="failed to get container status \"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae\": rpc error: code = NotFound desc = could not find container \"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae\": container with ID starting with 14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae not found: ID does not exist" Apr 20 14:57:56.704904 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.704844 2575 scope.go:117] "RemoveContainer" containerID="23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64" Apr 20 14:57:56.705073 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.705048 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64"} err="failed to get container status \"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64\": rpc error: code = NotFound desc = could not find container \"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64\": container with ID starting with 23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64 not found: ID does not exist" Apr 20 14:57:56.705073 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.705066 2575 scope.go:117] "RemoveContainer" containerID="b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93" Apr 20 14:57:56.705300 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.705281 2575 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93"} err="failed to get container status \"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93\": rpc error: code = NotFound desc = could not find container \"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93\": container with ID starting with b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93 not found: ID does not exist" Apr 20 14:57:56.705353 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.705300 2575 scope.go:117] "RemoveContainer" containerID="ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d" Apr 20 14:57:56.705540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.705503 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d"} err="failed to get container status \"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d\": rpc error: code = NotFound desc = could not find container \"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d\": container with ID starting with ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d not found: ID does not exist" Apr 20 14:57:56.705540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.705539 2575 scope.go:117] "RemoveContainer" containerID="d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b" Apr 20 14:57:56.705756 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.705733 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b"} err="failed to get container status \"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b\": rpc error: code = NotFound desc = could not find container 
\"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b\": container with ID starting with d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b not found: ID does not exist" Apr 20 14:57:56.705797 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.705756 2575 scope.go:117] "RemoveContainer" containerID="73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e" Apr 20 14:57:56.705965 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.705947 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e"} err="failed to get container status \"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e\": rpc error: code = NotFound desc = could not find container \"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e\": container with ID starting with 73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e not found: ID does not exist" Apr 20 14:57:56.706004 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.705966 2575 scope.go:117] "RemoveContainer" containerID="708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c" Apr 20 14:57:56.706210 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.706187 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c"} err="failed to get container status \"708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c\": rpc error: code = NotFound desc = could not find container \"708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c\": container with ID starting with 708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c not found: ID does not exist" Apr 20 14:57:56.706210 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.706210 2575 scope.go:117] "RemoveContainer" 
containerID="14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae" Apr 20 14:57:56.706428 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.706411 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae"} err="failed to get container status \"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae\": rpc error: code = NotFound desc = could not find container \"14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae\": container with ID starting with 14a8aeb8bdcbe50c19290e3c0c99d48289f067d0b2908d7a19fdd7762f18d4ae not found: ID does not exist" Apr 20 14:57:56.706473 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.706429 2575 scope.go:117] "RemoveContainer" containerID="23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64" Apr 20 14:57:56.706671 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.706649 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64"} err="failed to get container status \"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64\": rpc error: code = NotFound desc = could not find container \"23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64\": container with ID starting with 23330f359bb7926947cd7e9aadb22f6c249fe30a25d854f4a6745d28c26ffe64 not found: ID does not exist" Apr 20 14:57:56.706671 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.706669 2575 scope.go:117] "RemoveContainer" containerID="b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93" Apr 20 14:57:56.706880 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.706864 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93"} err="failed to get container status 
\"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93\": rpc error: code = NotFound desc = could not find container \"b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93\": container with ID starting with b500a0f3700565e2641bb278d4f6885c2d7bc5f4e8298612a78f44614b0aaa93 not found: ID does not exist" Apr 20 14:57:56.706880 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.706880 2575 scope.go:117] "RemoveContainer" containerID="ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d" Apr 20 14:57:56.707101 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.707083 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d"} err="failed to get container status \"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d\": rpc error: code = NotFound desc = could not find container \"ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d\": container with ID starting with ad6c7f714df8ec10cabc501b6790fbf8769e7ee9545c2a576f3ae73ba667a89d not found: ID does not exist" Apr 20 14:57:56.707147 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.707101 2575 scope.go:117] "RemoveContainer" containerID="d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b" Apr 20 14:57:56.707330 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.707310 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b"} err="failed to get container status \"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b\": rpc error: code = NotFound desc = could not find container \"d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b\": container with ID starting with d188c3e4ffb523eb17ca0791375446b7a02a3daa94c42184935c0924653bf00b not found: ID does not exist" Apr 20 14:57:56.707380 ip-10-0-129-82 
kubenswrapper[2575]: I0420 14:57:56.707331 2575 scope.go:117] "RemoveContainer" containerID="73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e" Apr 20 14:57:56.707530 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.707497 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e"} err="failed to get container status \"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e\": rpc error: code = NotFound desc = could not find container \"73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e\": container with ID starting with 73c1742f4b77616dab0276e377af0f508d68e9cdf3e10c2c9b25362b8c3db04e not found: ID does not exist" Apr 20 14:57:56.707586 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.707534 2575 scope.go:117] "RemoveContainer" containerID="708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c" Apr 20 14:57:56.707712 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.707695 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c"} err="failed to get container status \"708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c\": rpc error: code = NotFound desc = could not find container \"708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c\": container with ID starting with 708e2055bc0b2ee6808d96d01107ddf2501bfa43fb239c967506d8781733633c not found: ID does not exist" Apr 20 14:57:56.758973 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.758941 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-kube-rbac-proxy\") pod \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " Apr 20 14:57:56.758973 
ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.758974 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-prometheus-k8s-rulefiles-0\") pod \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " Apr 20 14:57:56.759203 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.759011 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-metrics-client-certs\") pod \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " Apr 20 14:57:56.759203 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.759034 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " Apr 20 14:57:56.759203 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.759068 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-tls-assets\") pod \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " Apr 20 14:57:56.759203 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.759091 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-prometheus-k8s-tls\") pod \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " Apr 20 14:57:56.759203 ip-10-0-129-82 
kubenswrapper[2575]: I0420 14:57:56.759116 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-config\") pod \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " Apr 20 14:57:56.759203 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.759140 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-thanos-prometheus-http-client-file\") pod \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " Apr 20 14:57:56.759203 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.759167 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-configmap-kubelet-serving-ca-bundle\") pod \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " Apr 20 14:57:56.759203 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.759198 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-config-out\") pod \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " Apr 20 14:57:56.761755 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.759234 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctqng\" (UniqueName: \"kubernetes.io/projected/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-kube-api-access-ctqng\") pod \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " Apr 20 14:57:56.761755 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.759291 2575 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " Apr 20 14:57:56.761755 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.759329 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-configmap-serving-certs-ca-bundle\") pod \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " Apr 20 14:57:56.761755 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.759361 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-configmap-metrics-client-ca\") pod \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " Apr 20 14:57:56.761755 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.759408 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-prometheus-trusted-ca-bundle\") pod \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " Apr 20 14:57:56.761755 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.759451 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-prometheus-k8s-db\") pod \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " Apr 20 14:57:56.761755 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.759479 2575 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-grpc-tls\") pod \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " Apr 20 14:57:56.761755 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.759506 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-web-config\") pod \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\" (UID: \"4d571f54-ebd6-43c9-b57e-bcf21a6a8668\") " Apr 20 14:57:56.761755 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.760092 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "4d571f54-ebd6-43c9-b57e-bcf21a6a8668" (UID: "4d571f54-ebd6-43c9-b57e-bcf21a6a8668"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:57:56.761755 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.760469 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "4d571f54-ebd6-43c9-b57e-bcf21a6a8668" (UID: "4d571f54-ebd6-43c9-b57e-bcf21a6a8668"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:57:56.761755 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.760657 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "4d571f54-ebd6-43c9-b57e-bcf21a6a8668" (UID: "4d571f54-ebd6-43c9-b57e-bcf21a6a8668"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:57:56.761755 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.760690 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "4d571f54-ebd6-43c9-b57e-bcf21a6a8668" (UID: "4d571f54-ebd6-43c9-b57e-bcf21a6a8668"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:57:56.761755 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.761602 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "4d571f54-ebd6-43c9-b57e-bcf21a6a8668" (UID: "4d571f54-ebd6-43c9-b57e-bcf21a6a8668"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:57:56.762384 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.761836 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "4d571f54-ebd6-43c9-b57e-bcf21a6a8668" (UID: "4d571f54-ebd6-43c9-b57e-bcf21a6a8668"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:57:56.763444 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.763415 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "4d571f54-ebd6-43c9-b57e-bcf21a6a8668" (UID: "4d571f54-ebd6-43c9-b57e-bcf21a6a8668"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:56.763563 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.763500 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-config" (OuterVolumeSpecName: "config") pod "4d571f54-ebd6-43c9-b57e-bcf21a6a8668" (UID: "4d571f54-ebd6-43c9-b57e-bcf21a6a8668"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:56.763563 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.763556 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "4d571f54-ebd6-43c9-b57e-bcf21a6a8668" (UID: "4d571f54-ebd6-43c9-b57e-bcf21a6a8668"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:57:56.763698 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.763595 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "4d571f54-ebd6-43c9-b57e-bcf21a6a8668" (UID: "4d571f54-ebd6-43c9-b57e-bcf21a6a8668"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:56.763757 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.763699 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "4d571f54-ebd6-43c9-b57e-bcf21a6a8668" (UID: "4d571f54-ebd6-43c9-b57e-bcf21a6a8668"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:56.763757 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.763744 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "4d571f54-ebd6-43c9-b57e-bcf21a6a8668" (UID: "4d571f54-ebd6-43c9-b57e-bcf21a6a8668"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:56.763855 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.763762 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "4d571f54-ebd6-43c9-b57e-bcf21a6a8668" (UID: "4d571f54-ebd6-43c9-b57e-bcf21a6a8668"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:56.763985 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.763961 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "4d571f54-ebd6-43c9-b57e-bcf21a6a8668" (UID: "4d571f54-ebd6-43c9-b57e-bcf21a6a8668"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:56.764058 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.764041 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "4d571f54-ebd6-43c9-b57e-bcf21a6a8668" (UID: "4d571f54-ebd6-43c9-b57e-bcf21a6a8668"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:56.764133 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.764112 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-config-out" (OuterVolumeSpecName: "config-out") pod "4d571f54-ebd6-43c9-b57e-bcf21a6a8668" (UID: "4d571f54-ebd6-43c9-b57e-bcf21a6a8668"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:57:56.764941 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.764913 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-kube-api-access-ctqng" (OuterVolumeSpecName: "kube-api-access-ctqng") pod "4d571f54-ebd6-43c9-b57e-bcf21a6a8668" (UID: "4d571f54-ebd6-43c9-b57e-bcf21a6a8668"). InnerVolumeSpecName "kube-api-access-ctqng". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:57:56.772815 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.772789 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-web-config" (OuterVolumeSpecName: "web-config") pod "4d571f54-ebd6-43c9-b57e-bcf21a6a8668" (UID: "4d571f54-ebd6-43c9-b57e-bcf21a6a8668"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:56.860850 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.860817 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:57:56.860850 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.860846 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:57:56.860850 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.860856 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-configmap-metrics-client-ca\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:57:56.861064 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.860866 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-prometheus-trusted-ca-bundle\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:57:56.861064 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.860875 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-prometheus-k8s-db\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:57:56.861064 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.860884 2575 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-grpc-tls\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:57:56.861064 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.860893 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-web-config\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:57:56.861064 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.860901 2575 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-kube-rbac-proxy\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:57:56.861064 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.860909 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:57:56.861064 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.860918 2575 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-metrics-client-certs\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:57:56.861064 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.860930 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:57:56.861064 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.860939 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-tls-assets\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:57:56.861064 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.860948 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-secret-prometheus-k8s-tls\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:57:56.861064 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.860957 2575 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-config\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:57:56.861064 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.860965 2575 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-thanos-prometheus-http-client-file\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:57:56.861064 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.860974 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:57:56.861064 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.860983 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-config-out\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:57:56.861064 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.860993 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ctqng\" (UniqueName: 
\"kubernetes.io/projected/4d571f54-ebd6-43c9-b57e-bcf21a6a8668-kube-api-access-ctqng\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 14:57:56.963842 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.961329 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 14:57:56.966006 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.965975 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 14:57:56.986062 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986039 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 14:57:56.986351 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986337 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="init-config-reloader" Apr 20 14:57:56.986410 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986353 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="init-config-reloader" Apr 20 14:57:56.986410 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986364 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="thanos-sidecar" Apr 20 14:57:56.986410 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986373 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="thanos-sidecar" Apr 20 14:57:56.986410 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986388 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="kube-rbac-proxy" Apr 20 14:57:56.986410 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986396 2575 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="kube-rbac-proxy" Apr 20 14:57:56.986410 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986408 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="prometheus" Apr 20 14:57:56.986623 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986416 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="prometheus" Apr 20 14:57:56.986623 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986428 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="config-reloader" Apr 20 14:57:56.986623 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986434 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="config-reloader" Apr 20 14:57:56.986623 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986445 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="kube-rbac-proxy-thanos" Apr 20 14:57:56.986623 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986450 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="kube-rbac-proxy-thanos" Apr 20 14:57:56.986623 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986456 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="kube-rbac-proxy-web" Apr 20 14:57:56.986623 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986461 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="kube-rbac-proxy-web" Apr 20 14:57:56.986623 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986524 2575 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="kube-rbac-proxy" Apr 20 14:57:56.986623 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986535 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="prometheus" Apr 20 14:57:56.986623 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986543 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="thanos-sidecar" Apr 20 14:57:56.986623 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986549 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="kube-rbac-proxy-thanos" Apr 20 14:57:56.986623 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986558 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="config-reloader" Apr 20 14:57:56.986623 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.986568 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" containerName="kube-rbac-proxy-web" Apr 20 14:57:56.994678 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.994659 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:56.997076 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.997032 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 14:57:56.997076 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.997060 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8gc4sujm77lvt\"" Apr 20 14:57:56.997427 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.997410 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 14:57:56.997427 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.997422 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 14:57:56.997589 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.997455 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 14:57:56.997589 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.997537 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 14:57:56.997589 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.997537 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 14:57:56.997730 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.997537 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-j5xw6\"" Apr 20 14:57:56.997730 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.997648 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 14:57:56.997730 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.997659 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 14:57:56.997904 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.997889 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 14:57:56.998015 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:56.997999 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 14:57:57.000462 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.000445 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 14:57:57.004006 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.003567 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 14:57:57.005531 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.005488 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 14:57:57.164575 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.164542 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.164575 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.164578 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/34eda7fa-4142-48e1-920d-a1a1d107166c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.164773 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.164597 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-web-config\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.164773 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.164665 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-config\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.164773 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.164697 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/34eda7fa-4142-48e1-920d-a1a1d107166c-config-out\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.164773 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.164716 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.164773 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.164736 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.164773 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.164763 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34eda7fa-4142-48e1-920d-a1a1d107166c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.164962 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.164817 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/34eda7fa-4142-48e1-920d-a1a1d107166c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.164962 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.164847 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.164962 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.164866 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.164962 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.164903 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34eda7fa-4142-48e1-920d-a1a1d107166c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.164962 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.164940 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sd7p\" (UniqueName: \"kubernetes.io/projected/34eda7fa-4142-48e1-920d-a1a1d107166c-kube-api-access-2sd7p\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.164962 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.164959 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34eda7fa-4142-48e1-920d-a1a1d107166c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.165126 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.164988 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/34eda7fa-4142-48e1-920d-a1a1d107166c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.165126 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.165011 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.165126 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.165030 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.165126 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.165057 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/34eda7fa-4142-48e1-920d-a1a1d107166c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.266000 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.265921 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.266000 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.265959 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.266000 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.265978 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/34eda7fa-4142-48e1-920d-a1a1d107166c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.266230 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.266016 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.266230 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.266034 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34eda7fa-4142-48e1-920d-a1a1d107166c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.266230 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.266050 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-web-config\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.266230 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.266079 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-config\") pod 
\"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.266230 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.266102 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/34eda7fa-4142-48e1-920d-a1a1d107166c-config-out\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.266230 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.266125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.266230 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.266156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.266230 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.266185 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34eda7fa-4142-48e1-920d-a1a1d107166c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.266230 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.266217 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/34eda7fa-4142-48e1-920d-a1a1d107166c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.266681 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.266240 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.266681 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.266264 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.266681 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.266308 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34eda7fa-4142-48e1-920d-a1a1d107166c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.266681 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.266354 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sd7p\" (UniqueName: \"kubernetes.io/projected/34eda7fa-4142-48e1-920d-a1a1d107166c-kube-api-access-2sd7p\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.266681 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.266386 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34eda7fa-4142-48e1-920d-a1a1d107166c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.266681 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.266438 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/34eda7fa-4142-48e1-920d-a1a1d107166c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.266681 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.266538 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/34eda7fa-4142-48e1-920d-a1a1d107166c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.268956 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.266913 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34eda7fa-4142-48e1-920d-a1a1d107166c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.268956 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.267622 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34eda7fa-4142-48e1-920d-a1a1d107166c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.268956 ip-10-0-129-82 
kubenswrapper[2575]: I0420 14:57:57.268136 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34eda7fa-4142-48e1-920d-a1a1d107166c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.269201 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.269173 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.269489 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.269432 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-web-config\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.269604 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.269545 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/34eda7fa-4142-48e1-920d-a1a1d107166c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.269713 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.269688 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.269826 
ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.269806 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34eda7fa-4142-48e1-920d-a1a1d107166c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.269954 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.269929 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.270540 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.270499 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/34eda7fa-4142-48e1-920d-a1a1d107166c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.270898 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.270874 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/34eda7fa-4142-48e1-920d-a1a1d107166c-config-out\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.271129 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.271104 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.271573 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.271550 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-config\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.271682 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.271663 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.271789 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.271771 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.272475 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.272458 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/34eda7fa-4142-48e1-920d-a1a1d107166c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.274082 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.274066 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sd7p\" (UniqueName: \"kubernetes.io/projected/34eda7fa-4142-48e1-920d-a1a1d107166c-kube-api-access-2sd7p\") pod \"prometheus-k8s-0\" (UID: \"34eda7fa-4142-48e1-920d-a1a1d107166c\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.307022 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.306998 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:57:57.430793 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.430763 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 14:57:57.433658 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:57:57.433632 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34eda7fa_4142_48e1_920d_a1a1d107166c.slice/crio-e1f9bc10357962ae20fba4f7018ab6125d9d9a2ae6860c403b8faf877f0347cb WatchSource:0}: Error finding container e1f9bc10357962ae20fba4f7018ab6125d9d9a2ae6860c403b8faf877f0347cb: Status 404 returned error can't find the container with id e1f9bc10357962ae20fba4f7018ab6125d9d9a2ae6860c403b8faf877f0347cb Apr 20 14:57:57.645915 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.645880 2575 generic.go:358] "Generic (PLEG): container finished" podID="34eda7fa-4142-48e1-920d-a1a1d107166c" containerID="2f9fcf5bcf4fbc8cac9730b34e89ca19a6a11af73f4a7aba51735956c91e97f3" exitCode=0 Apr 20 14:57:57.646068 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.645961 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"34eda7fa-4142-48e1-920d-a1a1d107166c","Type":"ContainerDied","Data":"2f9fcf5bcf4fbc8cac9730b34e89ca19a6a11af73f4a7aba51735956c91e97f3"} Apr 20 14:57:57.646068 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:57.646003 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"34eda7fa-4142-48e1-920d-a1a1d107166c","Type":"ContainerStarted","Data":"e1f9bc10357962ae20fba4f7018ab6125d9d9a2ae6860c403b8faf877f0347cb"} Apr 20 14:57:58.651634 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:58.651541 
2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"34eda7fa-4142-48e1-920d-a1a1d107166c","Type":"ContainerStarted","Data":"dd72c7810c2ec62d24f7211b95c84688ca6d32b3e9e81e8184f561b8a4528e36"} Apr 20 14:57:58.651634 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:58.651573 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"34eda7fa-4142-48e1-920d-a1a1d107166c","Type":"ContainerStarted","Data":"612ae7d9997241597adc78b7332ecf690bda7b22473d5520834bfc177d03776f"} Apr 20 14:57:58.651634 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:58.651582 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"34eda7fa-4142-48e1-920d-a1a1d107166c","Type":"ContainerStarted","Data":"79838bf5e4b1d5c9361d77acb71c1c0a62ea5b95c7dde9d0750d19a17d55461a"} Apr 20 14:57:58.651634 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:58.651591 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"34eda7fa-4142-48e1-920d-a1a1d107166c","Type":"ContainerStarted","Data":"48302c1840606c0532315b8209ce489e9ac84998e6a05b0811bdd9266175ace7"} Apr 20 14:57:58.651634 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:58.651600 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"34eda7fa-4142-48e1-920d-a1a1d107166c","Type":"ContainerStarted","Data":"b6c357a868b9aa1de32018fe0fff31026b1e188ce7a3d4bb0bbe1e759165692d"} Apr 20 14:57:58.651634 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:58.651607 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"34eda7fa-4142-48e1-920d-a1a1d107166c","Type":"ContainerStarted","Data":"a77cfeea6293df209da2f02565ea519b314c6e0c46bf2825352ce87fce9f5f19"} Apr 20 14:57:58.677061 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:58.677005 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.676987601 podStartE2EDuration="2.676987601s" podCreationTimestamp="2026-04-20 14:57:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:57:58.674570423 +0000 UTC m=+270.405888736" watchObservedRunningTime="2026-04-20 14:57:58.676987601 +0000 UTC m=+270.408305891" Apr 20 14:57:58.865969 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:57:58.865933 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d571f54-ebd6-43c9-b57e-bcf21a6a8668" path="/var/lib/kubelet/pods/4d571f54-ebd6-43c9-b57e-bcf21a6a8668/volumes" Apr 20 14:58:02.307662 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:02.307612 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:58:07.254833 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:58:07.254790 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl" podUID="6cda8435-e869-40a8-9726-f7b6d4767009" Apr 20 14:58:07.254833 ip-10-0-129-82 kubenswrapper[2575]: E0420 14:58:07.254813 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-t7cf5" podUID="b3af7863-723b-45a3-8247-7e29b9a9da3c" Apr 20 14:58:07.678244 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:07.678156 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-t7cf5" Apr 20 14:58:07.678244 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:07.678189 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl" Apr 20 14:58:11.281424 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:11.281383 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5" Apr 20 14:58:11.281826 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:11.281446 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert\") pod \"ingress-canary-fccb4\" (UID: \"a808e761-5c95-412e-a362-7e3ffb34caeb\") " pod="openshift-ingress-canary/ingress-canary-fccb4" Apr 20 14:58:11.281826 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:11.281471 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5flsl\" (UID: \"6cda8435-e869-40a8-9726-f7b6d4767009\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl" Apr 20 14:58:11.283830 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:11.283809 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3af7863-723b-45a3-8247-7e29b9a9da3c-metrics-tls\") pod \"dns-default-t7cf5\" (UID: \"b3af7863-723b-45a3-8247-7e29b9a9da3c\") " pod="openshift-dns/dns-default-t7cf5" Apr 20 14:58:11.283936 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:11.283862 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a808e761-5c95-412e-a362-7e3ffb34caeb-cert\") pod \"ingress-canary-fccb4\" (UID: \"a808e761-5c95-412e-a362-7e3ffb34caeb\") " pod="openshift-ingress-canary/ingress-canary-fccb4" Apr 20 14:58:11.283936 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:11.283910 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6cda8435-e869-40a8-9726-f7b6d4767009-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5flsl\" (UID: \"6cda8435-e869-40a8-9726-f7b6d4767009\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl" Apr 20 14:58:11.564456 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:11.564379 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jp5zp\"" Apr 20 14:58:11.571062 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:11.571045 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fccb4" Apr 20 14:58:11.582799 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:11.582770 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-75gpj\"" Apr 20 14:58:11.583003 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:11.582978 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rc7ph\"" Apr 20 14:58:11.590064 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:11.589874 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl" Apr 20 14:58:11.590181 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:11.590103 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-t7cf5" Apr 20 14:58:11.716424 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:11.716382 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fccb4"] Apr 20 14:58:11.722472 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:58:11.719444 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda808e761_5c95_412e_a362_7e3ffb34caeb.slice/crio-898f4361e45c40a86c8320095b8c5b1fec490f1344f664ed72de0235160fdbec WatchSource:0}: Error finding container 898f4361e45c40a86c8320095b8c5b1fec490f1344f664ed72de0235160fdbec: Status 404 returned error can't find the container with id 898f4361e45c40a86c8320095b8c5b1fec490f1344f664ed72de0235160fdbec Apr 20 14:58:11.737150 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:11.737129 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-5flsl"] Apr 20 14:58:11.739305 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:58:11.739274 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cda8435_e869_40a8_9726_f7b6d4767009.slice/crio-81fdc065d855f0891a3940dfe04906e255a4f0357fdf22c8aa9d2c1fb6ce2327 WatchSource:0}: Error finding container 81fdc065d855f0891a3940dfe04906e255a4f0357fdf22c8aa9d2c1fb6ce2327: Status 404 returned error can't find the container with id 81fdc065d855f0891a3940dfe04906e255a4f0357fdf22c8aa9d2c1fb6ce2327 Apr 20 14:58:11.758432 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:11.758403 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t7cf5"] Apr 20 14:58:11.761617 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:58:11.761583 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3af7863_723b_45a3_8247_7e29b9a9da3c.slice/crio-115ac508fa22e4fae0a5a5f7dda5bbb76bcad9a0a1553124eb990a196489f0a2 WatchSource:0}: Error finding container 115ac508fa22e4fae0a5a5f7dda5bbb76bcad9a0a1553124eb990a196489f0a2: Status 404 returned error can't find the container with id 115ac508fa22e4fae0a5a5f7dda5bbb76bcad9a0a1553124eb990a196489f0a2 Apr 20 14:58:12.697358 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:12.697303 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t7cf5" event={"ID":"b3af7863-723b-45a3-8247-7e29b9a9da3c","Type":"ContainerStarted","Data":"115ac508fa22e4fae0a5a5f7dda5bbb76bcad9a0a1553124eb990a196489f0a2"} Apr 20 14:58:12.698779 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:12.698732 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl" event={"ID":"6cda8435-e869-40a8-9726-f7b6d4767009","Type":"ContainerStarted","Data":"81fdc065d855f0891a3940dfe04906e255a4f0357fdf22c8aa9d2c1fb6ce2327"} Apr 20 14:58:12.700159 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:12.700120 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fccb4" event={"ID":"a808e761-5c95-412e-a362-7e3ffb34caeb","Type":"ContainerStarted","Data":"898f4361e45c40a86c8320095b8c5b1fec490f1344f664ed72de0235160fdbec"} Apr 20 14:58:14.709623 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:14.709583 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t7cf5" event={"ID":"b3af7863-723b-45a3-8247-7e29b9a9da3c","Type":"ContainerStarted","Data":"1052269537a473e873b05203dbb04be9b556562c0cb6e731f696185dcec4e955"} Apr 20 14:58:14.709623 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:14.709621 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t7cf5" 
event={"ID":"b3af7863-723b-45a3-8247-7e29b9a9da3c","Type":"ContainerStarted","Data":"f45c424abd3b7ec9e69a6095b8ca7834c0aa8ebef082149bd546444a4893d144"} Apr 20 14:58:14.710073 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:14.709712 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-t7cf5" Apr 20 14:58:14.710951 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:14.710929 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl" event={"ID":"6cda8435-e869-40a8-9726-f7b6d4767009","Type":"ContainerStarted","Data":"a757720973d25fc54541b0c3e3d408b50f0f46c00e0e7491dd783afb87a5f79f"} Apr 20 14:58:14.712079 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:14.712057 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fccb4" event={"ID":"a808e761-5c95-412e-a362-7e3ffb34caeb","Type":"ContainerStarted","Data":"9467b7ef184857b005a171155f9e8f602ebe307db9d7597d921f4e6000fe4afc"} Apr 20 14:58:14.727161 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:14.727105 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-t7cf5" podStartSLOduration=251.637933554 podStartE2EDuration="4m13.727086525s" podCreationTimestamp="2026-04-20 14:54:01 +0000 UTC" firstStartedPulling="2026-04-20 14:58:11.763289934 +0000 UTC m=+283.494608199" lastFinishedPulling="2026-04-20 14:58:13.852442888 +0000 UTC m=+285.583761170" observedRunningTime="2026-04-20 14:58:14.725005503 +0000 UTC m=+286.456323803" watchObservedRunningTime="2026-04-20 14:58:14.727086525 +0000 UTC m=+286.458404813" Apr 20 14:58:14.739010 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:14.738957 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5flsl" podStartSLOduration=269.629246769 podStartE2EDuration="4m31.738938269s" 
podCreationTimestamp="2026-04-20 14:53:43 +0000 UTC" firstStartedPulling="2026-04-20 14:58:11.74175419 +0000 UTC m=+283.473072470" lastFinishedPulling="2026-04-20 14:58:13.851445704 +0000 UTC m=+285.582763970" observedRunningTime="2026-04-20 14:58:14.738431207 +0000 UTC m=+286.469749510" watchObservedRunningTime="2026-04-20 14:58:14.738938269 +0000 UTC m=+286.470256558" Apr 20 14:58:14.753021 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:14.752960 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fccb4" podStartSLOduration=251.620334861 podStartE2EDuration="4m13.752943134s" podCreationTimestamp="2026-04-20 14:54:01 +0000 UTC" firstStartedPulling="2026-04-20 14:58:11.724565887 +0000 UTC m=+283.455884156" lastFinishedPulling="2026-04-20 14:58:13.857174164 +0000 UTC m=+285.588492429" observedRunningTime="2026-04-20 14:58:14.752422605 +0000 UTC m=+286.483740894" watchObservedRunningTime="2026-04-20 14:58:14.752943134 +0000 UTC m=+286.484261424" Apr 20 14:58:24.717460 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:24.717429 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-t7cf5" Apr 20 14:58:28.791366 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:28.791334 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/2.log" Apr 20 14:58:28.791883 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:28.791866 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/2.log" Apr 20 14:58:28.797369 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:28.797344 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-acl-logging/0.log" 
Apr 20 14:58:28.797771 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:28.797742 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-acl-logging/0.log" Apr 20 14:58:57.307879 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:57.307819 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:58:57.323649 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:57.323622 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:58:57.857677 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:58:57.857648 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:59:21.341874 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:21.341772 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-87cfj"] Apr 20 14:59:21.345114 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:21.345089 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-87cfj" Apr 20 14:59:21.347471 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:21.347449 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 20 14:59:21.348378 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:21.348363 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 20 14:59:21.348473 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:21.348415 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-6rgvl\"" Apr 20 14:59:21.353019 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:21.352997 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-87cfj"] Apr 20 14:59:21.492663 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:21.492622 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/faadef63-6f45-4f1d-8758-1b319b6d0340-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-87cfj\" (UID: \"faadef63-6f45-4f1d-8758-1b319b6d0340\") " pod="cert-manager/cert-manager-webhook-597b96b99b-87cfj" Apr 20 14:59:21.492845 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:21.492772 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9kjn\" (UniqueName: \"kubernetes.io/projected/faadef63-6f45-4f1d-8758-1b319b6d0340-kube-api-access-m9kjn\") pod \"cert-manager-webhook-597b96b99b-87cfj\" (UID: \"faadef63-6f45-4f1d-8758-1b319b6d0340\") " pod="cert-manager/cert-manager-webhook-597b96b99b-87cfj" Apr 20 14:59:21.593893 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:21.593805 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9kjn\" 
(UniqueName: \"kubernetes.io/projected/faadef63-6f45-4f1d-8758-1b319b6d0340-kube-api-access-m9kjn\") pod \"cert-manager-webhook-597b96b99b-87cfj\" (UID: \"faadef63-6f45-4f1d-8758-1b319b6d0340\") " pod="cert-manager/cert-manager-webhook-597b96b99b-87cfj" Apr 20 14:59:21.593893 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:21.593843 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/faadef63-6f45-4f1d-8758-1b319b6d0340-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-87cfj\" (UID: \"faadef63-6f45-4f1d-8758-1b319b6d0340\") " pod="cert-manager/cert-manager-webhook-597b96b99b-87cfj" Apr 20 14:59:21.602207 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:21.602181 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/faadef63-6f45-4f1d-8758-1b319b6d0340-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-87cfj\" (UID: \"faadef63-6f45-4f1d-8758-1b319b6d0340\") " pod="cert-manager/cert-manager-webhook-597b96b99b-87cfj" Apr 20 14:59:21.602338 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:21.602319 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9kjn\" (UniqueName: \"kubernetes.io/projected/faadef63-6f45-4f1d-8758-1b319b6d0340-kube-api-access-m9kjn\") pod \"cert-manager-webhook-597b96b99b-87cfj\" (UID: \"faadef63-6f45-4f1d-8758-1b319b6d0340\") " pod="cert-manager/cert-manager-webhook-597b96b99b-87cfj" Apr 20 14:59:21.664382 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:21.664331 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-87cfj" Apr 20 14:59:21.786834 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:21.786807 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-87cfj"] Apr 20 14:59:21.789657 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:59:21.789621 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaadef63_6f45_4f1d_8758_1b319b6d0340.slice/crio-a66ddbd888c62eec88c18813e7ed0944e5a548ad1953556e8529f3fcbff61b37 WatchSource:0}: Error finding container a66ddbd888c62eec88c18813e7ed0944e5a548ad1953556e8529f3fcbff61b37: Status 404 returned error can't find the container with id a66ddbd888c62eec88c18813e7ed0944e5a548ad1953556e8529f3fcbff61b37 Apr 20 14:59:21.791572 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:21.791550 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 14:59:21.907936 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:21.907846 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-87cfj" event={"ID":"faadef63-6f45-4f1d-8758-1b319b6d0340","Type":"ContainerStarted","Data":"a66ddbd888c62eec88c18813e7ed0944e5a548ad1953556e8529f3fcbff61b37"} Apr 20 14:59:25.923060 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:25.923026 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-87cfj" event={"ID":"faadef63-6f45-4f1d-8758-1b319b6d0340","Type":"ContainerStarted","Data":"bbb8d5a2ce4e580986c42eca9770aabe72bb8ba0891764484f206fb6ed652681"} Apr 20 14:59:25.923461 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:25.923083 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-87cfj" Apr 20 14:59:25.939411 ip-10-0-129-82 kubenswrapper[2575]: I0420 
14:59:25.939356 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-87cfj" podStartSLOduration=1.319711618 podStartE2EDuration="4.939341706s" podCreationTimestamp="2026-04-20 14:59:21 +0000 UTC" firstStartedPulling="2026-04-20 14:59:21.791722892 +0000 UTC m=+353.523041157" lastFinishedPulling="2026-04-20 14:59:25.411352976 +0000 UTC m=+357.142671245" observedRunningTime="2026-04-20 14:59:25.937831177 +0000 UTC m=+357.669149466" watchObservedRunningTime="2026-04-20 14:59:25.939341706 +0000 UTC m=+357.670659994" Apr 20 14:59:31.343073 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:31.343031 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-lmfxk"] Apr 20 14:59:31.347111 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:31.347080 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-lmfxk" Apr 20 14:59:31.351561 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:31.351481 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-j4st8\"" Apr 20 14:59:31.357622 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:31.357590 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-lmfxk"] Apr 20 14:59:31.487968 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:31.487926 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/715a6659-c50a-42aa-bb64-ee5777f3372e-bound-sa-token\") pod \"cert-manager-759f64656b-lmfxk\" (UID: \"715a6659-c50a-42aa-bb64-ee5777f3372e\") " pod="cert-manager/cert-manager-759f64656b-lmfxk" Apr 20 14:59:31.488151 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:31.487986 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jkvtp\" (UniqueName: \"kubernetes.io/projected/715a6659-c50a-42aa-bb64-ee5777f3372e-kube-api-access-jkvtp\") pod \"cert-manager-759f64656b-lmfxk\" (UID: \"715a6659-c50a-42aa-bb64-ee5777f3372e\") " pod="cert-manager/cert-manager-759f64656b-lmfxk" Apr 20 14:59:31.588886 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:31.588840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/715a6659-c50a-42aa-bb64-ee5777f3372e-bound-sa-token\") pod \"cert-manager-759f64656b-lmfxk\" (UID: \"715a6659-c50a-42aa-bb64-ee5777f3372e\") " pod="cert-manager/cert-manager-759f64656b-lmfxk" Apr 20 14:59:31.588886 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:31.588895 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkvtp\" (UniqueName: \"kubernetes.io/projected/715a6659-c50a-42aa-bb64-ee5777f3372e-kube-api-access-jkvtp\") pod \"cert-manager-759f64656b-lmfxk\" (UID: \"715a6659-c50a-42aa-bb64-ee5777f3372e\") " pod="cert-manager/cert-manager-759f64656b-lmfxk" Apr 20 14:59:31.598555 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:31.598451 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkvtp\" (UniqueName: \"kubernetes.io/projected/715a6659-c50a-42aa-bb64-ee5777f3372e-kube-api-access-jkvtp\") pod \"cert-manager-759f64656b-lmfxk\" (UID: \"715a6659-c50a-42aa-bb64-ee5777f3372e\") " pod="cert-manager/cert-manager-759f64656b-lmfxk" Apr 20 14:59:31.598877 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:31.598858 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/715a6659-c50a-42aa-bb64-ee5777f3372e-bound-sa-token\") pod \"cert-manager-759f64656b-lmfxk\" (UID: \"715a6659-c50a-42aa-bb64-ee5777f3372e\") " pod="cert-manager/cert-manager-759f64656b-lmfxk" Apr 20 14:59:31.662861 ip-10-0-129-82 kubenswrapper[2575]: I0420 
14:59:31.662814 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-lmfxk" Apr 20 14:59:31.793088 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:31.793043 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-lmfxk"] Apr 20 14:59:31.797257 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:59:31.797219 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod715a6659_c50a_42aa_bb64_ee5777f3372e.slice/crio-c4d71e34416689aa59dc3f450e2039d95188946d8527aa7d4ddb763fdbb28041 WatchSource:0}: Error finding container c4d71e34416689aa59dc3f450e2039d95188946d8527aa7d4ddb763fdbb28041: Status 404 returned error can't find the container with id c4d71e34416689aa59dc3f450e2039d95188946d8527aa7d4ddb763fdbb28041 Apr 20 14:59:31.928639 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:31.928606 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-87cfj" Apr 20 14:59:31.941435 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:31.941394 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-lmfxk" event={"ID":"715a6659-c50a-42aa-bb64-ee5777f3372e","Type":"ContainerStarted","Data":"068788aaf2ca7c808337bf37590d740ef411f8d4280f748873cf7b1ef63772bf"} Apr 20 14:59:31.941435 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:31.941440 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-lmfxk" event={"ID":"715a6659-c50a-42aa-bb64-ee5777f3372e","Type":"ContainerStarted","Data":"c4d71e34416689aa59dc3f450e2039d95188946d8527aa7d4ddb763fdbb28041"} Apr 20 14:59:31.957533 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:31.957455 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-lmfxk" 
podStartSLOduration=0.957436978 podStartE2EDuration="957.436978ms" podCreationTimestamp="2026-04-20 14:59:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:59:31.955101086 +0000 UTC m=+363.686419367" watchObservedRunningTime="2026-04-20 14:59:31.957436978 +0000 UTC m=+363.688755267" Apr 20 14:59:51.081231 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:51.081187 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-854569cf8c-zr74q"] Apr 20 14:59:51.087569 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:51.087539 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zr74q" Apr 20 14:59:51.089881 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:51.089849 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 14:59:51.090022 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:51.089878 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 14:59:51.090022 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:51.089908 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-525vc\"" Apr 20 14:59:51.090173 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:51.090157 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 14:59:51.090247 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:51.090232 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 14:59:51.098759 ip-10-0-129-82 kubenswrapper[2575]: I0420 
14:59:51.098734 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-854569cf8c-zr74q"] Apr 20 14:59:51.159774 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:51.159730 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1301f538-12c6-4361-8cfa-37d9a2f7f4be-webhook-cert\") pod \"opendatahub-operator-controller-manager-854569cf8c-zr74q\" (UID: \"1301f538-12c6-4361-8cfa-37d9a2f7f4be\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zr74q" Apr 20 14:59:51.159966 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:51.159833 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1301f538-12c6-4361-8cfa-37d9a2f7f4be-apiservice-cert\") pod \"opendatahub-operator-controller-manager-854569cf8c-zr74q\" (UID: \"1301f538-12c6-4361-8cfa-37d9a2f7f4be\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zr74q" Apr 20 14:59:51.159966 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:51.159898 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9cwh\" (UniqueName: \"kubernetes.io/projected/1301f538-12c6-4361-8cfa-37d9a2f7f4be-kube-api-access-t9cwh\") pod \"opendatahub-operator-controller-manager-854569cf8c-zr74q\" (UID: \"1301f538-12c6-4361-8cfa-37d9a2f7f4be\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zr74q" Apr 20 14:59:51.261136 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:51.261099 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9cwh\" (UniqueName: \"kubernetes.io/projected/1301f538-12c6-4361-8cfa-37d9a2f7f4be-kube-api-access-t9cwh\") pod \"opendatahub-operator-controller-manager-854569cf8c-zr74q\" (UID: 
\"1301f538-12c6-4361-8cfa-37d9a2f7f4be\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zr74q" Apr 20 14:59:51.261320 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:51.261152 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1301f538-12c6-4361-8cfa-37d9a2f7f4be-webhook-cert\") pod \"opendatahub-operator-controller-manager-854569cf8c-zr74q\" (UID: \"1301f538-12c6-4361-8cfa-37d9a2f7f4be\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zr74q" Apr 20 14:59:51.261320 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:51.261190 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1301f538-12c6-4361-8cfa-37d9a2f7f4be-apiservice-cert\") pod \"opendatahub-operator-controller-manager-854569cf8c-zr74q\" (UID: \"1301f538-12c6-4361-8cfa-37d9a2f7f4be\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zr74q" Apr 20 14:59:51.263653 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:51.263629 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1301f538-12c6-4361-8cfa-37d9a2f7f4be-webhook-cert\") pod \"opendatahub-operator-controller-manager-854569cf8c-zr74q\" (UID: \"1301f538-12c6-4361-8cfa-37d9a2f7f4be\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zr74q" Apr 20 14:59:51.263745 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:51.263678 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1301f538-12c6-4361-8cfa-37d9a2f7f4be-apiservice-cert\") pod \"opendatahub-operator-controller-manager-854569cf8c-zr74q\" (UID: \"1301f538-12c6-4361-8cfa-37d9a2f7f4be\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zr74q" Apr 20 14:59:51.270621 ip-10-0-129-82 
kubenswrapper[2575]: I0420 14:59:51.270600 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9cwh\" (UniqueName: \"kubernetes.io/projected/1301f538-12c6-4361-8cfa-37d9a2f7f4be-kube-api-access-t9cwh\") pod \"opendatahub-operator-controller-manager-854569cf8c-zr74q\" (UID: \"1301f538-12c6-4361-8cfa-37d9a2f7f4be\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zr74q" Apr 20 14:59:51.398503 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:51.398409 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zr74q" Apr 20 14:59:51.523950 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:51.523913 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-854569cf8c-zr74q"] Apr 20 14:59:51.527881 ip-10-0-129-82 kubenswrapper[2575]: W0420 14:59:51.527854 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1301f538_12c6_4361_8cfa_37d9a2f7f4be.slice/crio-7b6ad70194b6c4b1efdde3968d6fd6312d5f53bbbfac953c6c659a2e56a9290a WatchSource:0}: Error finding container 7b6ad70194b6c4b1efdde3968d6fd6312d5f53bbbfac953c6c659a2e56a9290a: Status 404 returned error can't find the container with id 7b6ad70194b6c4b1efdde3968d6fd6312d5f53bbbfac953c6c659a2e56a9290a Apr 20 14:59:52.003400 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:52.003359 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zr74q" event={"ID":"1301f538-12c6-4361-8cfa-37d9a2f7f4be","Type":"ContainerStarted","Data":"7b6ad70194b6c4b1efdde3968d6fd6312d5f53bbbfac953c6c659a2e56a9290a"} Apr 20 14:59:55.015750 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:55.015711 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zr74q" event={"ID":"1301f538-12c6-4361-8cfa-37d9a2f7f4be","Type":"ContainerStarted","Data":"01aa482e801018a14f0d17bde861d1e6d3d6ee12cf887670c5a6982da883d84f"} Apr 20 14:59:55.016152 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:55.015871 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zr74q" Apr 20 14:59:55.036489 ip-10-0-129-82 kubenswrapper[2575]: I0420 14:59:55.036433 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zr74q" podStartSLOduration=1.491422054 podStartE2EDuration="4.036415535s" podCreationTimestamp="2026-04-20 14:59:51 +0000 UTC" firstStartedPulling="2026-04-20 14:59:51.529742983 +0000 UTC m=+383.261061249" lastFinishedPulling="2026-04-20 14:59:54.07473645 +0000 UTC m=+385.806054730" observedRunningTime="2026-04-20 14:59:55.034208821 +0000 UTC m=+386.765527108" watchObservedRunningTime="2026-04-20 14:59:55.036415535 +0000 UTC m=+386.767733822" Apr 20 15:00:04.396439 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.396377 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d"] Apr 20 15:00:04.401010 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.400979 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d" Apr 20 15:00:04.404441 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.404409 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-lj4z8\"" Apr 20 15:00:04.404643 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.404452 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 15:00:04.404643 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.404473 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 15:00:04.404643 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.404537 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 15:00:04.404643 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.404477 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 15:00:04.404643 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.404623 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 15:00:04.407886 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.407852 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d"] Apr 20 15:00:04.483605 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.483552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/375e227b-169e-4f82-8fc0-f666eb13f899-metrics-cert\") pod \"lws-controller-manager-54f6c466b9-rdk9d\" (UID: \"375e227b-169e-4f82-8fc0-f666eb13f899\") " 
pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d" Apr 20 15:00:04.483821 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.483619 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/375e227b-169e-4f82-8fc0-f666eb13f899-cert\") pod \"lws-controller-manager-54f6c466b9-rdk9d\" (UID: \"375e227b-169e-4f82-8fc0-f666eb13f899\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d" Apr 20 15:00:04.483821 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.483652 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/375e227b-169e-4f82-8fc0-f666eb13f899-manager-config\") pod \"lws-controller-manager-54f6c466b9-rdk9d\" (UID: \"375e227b-169e-4f82-8fc0-f666eb13f899\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d" Apr 20 15:00:04.483821 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.483772 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tqf6\" (UniqueName: \"kubernetes.io/projected/375e227b-169e-4f82-8fc0-f666eb13f899-kube-api-access-4tqf6\") pod \"lws-controller-manager-54f6c466b9-rdk9d\" (UID: \"375e227b-169e-4f82-8fc0-f666eb13f899\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d" Apr 20 15:00:04.584934 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.584894 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4tqf6\" (UniqueName: \"kubernetes.io/projected/375e227b-169e-4f82-8fc0-f666eb13f899-kube-api-access-4tqf6\") pod \"lws-controller-manager-54f6c466b9-rdk9d\" (UID: \"375e227b-169e-4f82-8fc0-f666eb13f899\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d" Apr 20 15:00:04.585145 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.584947 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/375e227b-169e-4f82-8fc0-f666eb13f899-metrics-cert\") pod \"lws-controller-manager-54f6c466b9-rdk9d\" (UID: \"375e227b-169e-4f82-8fc0-f666eb13f899\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d" Apr 20 15:00:04.585145 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.584984 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/375e227b-169e-4f82-8fc0-f666eb13f899-cert\") pod \"lws-controller-manager-54f6c466b9-rdk9d\" (UID: \"375e227b-169e-4f82-8fc0-f666eb13f899\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d" Apr 20 15:00:04.585145 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.585003 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/375e227b-169e-4f82-8fc0-f666eb13f899-manager-config\") pod \"lws-controller-manager-54f6c466b9-rdk9d\" (UID: \"375e227b-169e-4f82-8fc0-f666eb13f899\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d" Apr 20 15:00:04.585661 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.585632 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/375e227b-169e-4f82-8fc0-f666eb13f899-manager-config\") pod \"lws-controller-manager-54f6c466b9-rdk9d\" (UID: \"375e227b-169e-4f82-8fc0-f666eb13f899\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d" Apr 20 15:00:04.587632 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.587612 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/375e227b-169e-4f82-8fc0-f666eb13f899-cert\") pod \"lws-controller-manager-54f6c466b9-rdk9d\" (UID: \"375e227b-169e-4f82-8fc0-f666eb13f899\") " 
pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d" Apr 20 15:00:04.587783 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.587763 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/375e227b-169e-4f82-8fc0-f666eb13f899-metrics-cert\") pod \"lws-controller-manager-54f6c466b9-rdk9d\" (UID: \"375e227b-169e-4f82-8fc0-f666eb13f899\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d" Apr 20 15:00:04.596206 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.596176 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tqf6\" (UniqueName: \"kubernetes.io/projected/375e227b-169e-4f82-8fc0-f666eb13f899-kube-api-access-4tqf6\") pod \"lws-controller-manager-54f6c466b9-rdk9d\" (UID: \"375e227b-169e-4f82-8fc0-f666eb13f899\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d" Apr 20 15:00:04.713637 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.713588 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d" Apr 20 15:00:04.841695 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:04.841649 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d"] Apr 20 15:00:04.844631 ip-10-0-129-82 kubenswrapper[2575]: W0420 15:00:04.844601 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod375e227b_169e_4f82_8fc0_f666eb13f899.slice/crio-cdb3a414836bc7567a0f9955cc916c830a9f991c66114cedb1b9c5be33dadda4 WatchSource:0}: Error finding container cdb3a414836bc7567a0f9955cc916c830a9f991c66114cedb1b9c5be33dadda4: Status 404 returned error can't find the container with id cdb3a414836bc7567a0f9955cc916c830a9f991c66114cedb1b9c5be33dadda4 Apr 20 15:00:05.048890 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:05.048798 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d" event={"ID":"375e227b-169e-4f82-8fc0-f666eb13f899","Type":"ContainerStarted","Data":"cdb3a414836bc7567a0f9955cc916c830a9f991c66114cedb1b9c5be33dadda4"} Apr 20 15:00:06.022345 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:06.022311 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-zr74q" Apr 20 15:00:08.638915 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:08.638875 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-597dfdc786-kfh7q"] Apr 20 15:00:08.643803 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:08.643776 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-597dfdc786-kfh7q" Apr 20 15:00:08.646170 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:08.646146 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 15:00:08.647263 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:08.647243 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 15:00:08.647412 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:08.647242 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 15:00:08.647618 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:08.647252 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-m2765\"" Apr 20 15:00:08.647723 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:08.647314 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 15:00:08.649844 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:08.649823 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-597dfdc786-kfh7q"] Apr 20 15:00:08.825198 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:08.825164 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/95518758-6800-47d6-a2b2-133367bc4bf8-tmp\") pod \"kube-auth-proxy-597dfdc786-kfh7q\" (UID: \"95518758-6800-47d6-a2b2-133367bc4bf8\") " pod="openshift-ingress/kube-auth-proxy-597dfdc786-kfh7q" Apr 20 15:00:08.825366 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:08.825228 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zwpt\" (UniqueName: 
\"kubernetes.io/projected/95518758-6800-47d6-a2b2-133367bc4bf8-kube-api-access-9zwpt\") pod \"kube-auth-proxy-597dfdc786-kfh7q\" (UID: \"95518758-6800-47d6-a2b2-133367bc4bf8\") " pod="openshift-ingress/kube-auth-proxy-597dfdc786-kfh7q" Apr 20 15:00:08.825366 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:08.825276 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/95518758-6800-47d6-a2b2-133367bc4bf8-tls-certs\") pod \"kube-auth-proxy-597dfdc786-kfh7q\" (UID: \"95518758-6800-47d6-a2b2-133367bc4bf8\") " pod="openshift-ingress/kube-auth-proxy-597dfdc786-kfh7q" Apr 20 15:00:08.926500 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:08.926395 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/95518758-6800-47d6-a2b2-133367bc4bf8-tls-certs\") pod \"kube-auth-proxy-597dfdc786-kfh7q\" (UID: \"95518758-6800-47d6-a2b2-133367bc4bf8\") " pod="openshift-ingress/kube-auth-proxy-597dfdc786-kfh7q" Apr 20 15:00:08.926500 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:08.926461 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/95518758-6800-47d6-a2b2-133367bc4bf8-tmp\") pod \"kube-auth-proxy-597dfdc786-kfh7q\" (UID: \"95518758-6800-47d6-a2b2-133367bc4bf8\") " pod="openshift-ingress/kube-auth-proxy-597dfdc786-kfh7q" Apr 20 15:00:08.926774 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:08.926542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zwpt\" (UniqueName: \"kubernetes.io/projected/95518758-6800-47d6-a2b2-133367bc4bf8-kube-api-access-9zwpt\") pod \"kube-auth-proxy-597dfdc786-kfh7q\" (UID: \"95518758-6800-47d6-a2b2-133367bc4bf8\") " pod="openshift-ingress/kube-auth-proxy-597dfdc786-kfh7q" Apr 20 15:00:08.928827 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:08.928799 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/95518758-6800-47d6-a2b2-133367bc4bf8-tmp\") pod \"kube-auth-proxy-597dfdc786-kfh7q\" (UID: \"95518758-6800-47d6-a2b2-133367bc4bf8\") " pod="openshift-ingress/kube-auth-proxy-597dfdc786-kfh7q" Apr 20 15:00:08.929038 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:08.929011 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/95518758-6800-47d6-a2b2-133367bc4bf8-tls-certs\") pod \"kube-auth-proxy-597dfdc786-kfh7q\" (UID: \"95518758-6800-47d6-a2b2-133367bc4bf8\") " pod="openshift-ingress/kube-auth-proxy-597dfdc786-kfh7q" Apr 20 15:00:08.934406 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:08.934369 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zwpt\" (UniqueName: \"kubernetes.io/projected/95518758-6800-47d6-a2b2-133367bc4bf8-kube-api-access-9zwpt\") pod \"kube-auth-proxy-597dfdc786-kfh7q\" (UID: \"95518758-6800-47d6-a2b2-133367bc4bf8\") " pod="openshift-ingress/kube-auth-proxy-597dfdc786-kfh7q" Apr 20 15:00:08.955254 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:08.955216 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-597dfdc786-kfh7q" Apr 20 15:00:09.090686 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:09.090649 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-597dfdc786-kfh7q"] Apr 20 15:00:09.094054 ip-10-0-129-82 kubenswrapper[2575]: W0420 15:00:09.094022 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95518758_6800_47d6_a2b2_133367bc4bf8.slice/crio-b83564975925134fae5513b9e298aa305b2bbf32fb093b5e457d658320248121 WatchSource:0}: Error finding container b83564975925134fae5513b9e298aa305b2bbf32fb093b5e457d658320248121: Status 404 returned error can't find the container with id b83564975925134fae5513b9e298aa305b2bbf32fb093b5e457d658320248121 Apr 20 15:00:10.068253 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:10.068214 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-597dfdc786-kfh7q" event={"ID":"95518758-6800-47d6-a2b2-133367bc4bf8","Type":"ContainerStarted","Data":"b83564975925134fae5513b9e298aa305b2bbf32fb093b5e457d658320248121"} Apr 20 15:00:14.085209 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:14.085167 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-597dfdc786-kfh7q" event={"ID":"95518758-6800-47d6-a2b2-133367bc4bf8","Type":"ContainerStarted","Data":"8ca6800d5660eef42632c7b5978311066997afaa6be19f17eca6cd4fb9108fcc"} Apr 20 15:00:14.086605 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:14.086564 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d" event={"ID":"375e227b-169e-4f82-8fc0-f666eb13f899","Type":"ContainerStarted","Data":"10df0f532c26c599ff7e46930a28ef726710fa00c4bdcdb10691ba727bbb8566"} Apr 20 15:00:14.086748 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:14.086647 2575 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d" Apr 20 15:00:14.101590 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:14.101540 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-597dfdc786-kfh7q" podStartSLOduration=2.181405489 podStartE2EDuration="6.101503643s" podCreationTimestamp="2026-04-20 15:00:08 +0000 UTC" firstStartedPulling="2026-04-20 15:00:09.095830539 +0000 UTC m=+400.827148805" lastFinishedPulling="2026-04-20 15:00:13.015928689 +0000 UTC m=+404.747246959" observedRunningTime="2026-04-20 15:00:14.099819754 +0000 UTC m=+405.831138042" watchObservedRunningTime="2026-04-20 15:00:14.101503643 +0000 UTC m=+405.832821931" Apr 20 15:00:14.115841 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:14.115792 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d" podStartSLOduration=1.954017573 podStartE2EDuration="10.115775615s" podCreationTimestamp="2026-04-20 15:00:04 +0000 UTC" firstStartedPulling="2026-04-20 15:00:04.846449472 +0000 UTC m=+396.577767737" lastFinishedPulling="2026-04-20 15:00:13.00820751 +0000 UTC m=+404.739525779" observedRunningTime="2026-04-20 15:00:14.113905617 +0000 UTC m=+405.845223906" watchObservedRunningTime="2026-04-20 15:00:14.115775615 +0000 UTC m=+405.847093904" Apr 20 15:00:25.091803 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:00:25.091767 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-rdk9d" Apr 20 15:02:28.895539 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:28.895494 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-xhjn4"] Apr 20 15:02:28.898779 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:28.898762 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-xhjn4" Apr 20 15:02:28.900995 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:28.900973 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 15:02:28.901208 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:28.901186 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-2pfqw\"" Apr 20 15:02:28.901925 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:28.901907 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 15:02:28.902032 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:28.901941 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 15:02:28.905359 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:28.905337 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-xhjn4"] Apr 20 15:02:28.995905 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:28.995874 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-xhjn4"] Apr 20 15:02:29.053819 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:29.053783 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/477a69f2-99fd-4b97-9c68-bf46ac6b1bb9-config-file\") pod \"limitador-limitador-7d549b5b-xhjn4\" (UID: \"477a69f2-99fd-4b97-9c68-bf46ac6b1bb9\") " pod="kuadrant-system/limitador-limitador-7d549b5b-xhjn4" Apr 20 15:02:29.053989 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:29.053867 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87vwm\" (UniqueName: 
\"kubernetes.io/projected/477a69f2-99fd-4b97-9c68-bf46ac6b1bb9-kube-api-access-87vwm\") pod \"limitador-limitador-7d549b5b-xhjn4\" (UID: \"477a69f2-99fd-4b97-9c68-bf46ac6b1bb9\") " pod="kuadrant-system/limitador-limitador-7d549b5b-xhjn4" Apr 20 15:02:29.155212 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:29.155124 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/477a69f2-99fd-4b97-9c68-bf46ac6b1bb9-config-file\") pod \"limitador-limitador-7d549b5b-xhjn4\" (UID: \"477a69f2-99fd-4b97-9c68-bf46ac6b1bb9\") " pod="kuadrant-system/limitador-limitador-7d549b5b-xhjn4" Apr 20 15:02:29.155358 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:29.155220 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87vwm\" (UniqueName: \"kubernetes.io/projected/477a69f2-99fd-4b97-9c68-bf46ac6b1bb9-kube-api-access-87vwm\") pod \"limitador-limitador-7d549b5b-xhjn4\" (UID: \"477a69f2-99fd-4b97-9c68-bf46ac6b1bb9\") " pod="kuadrant-system/limitador-limitador-7d549b5b-xhjn4" Apr 20 15:02:29.155781 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:29.155763 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/477a69f2-99fd-4b97-9c68-bf46ac6b1bb9-config-file\") pod \"limitador-limitador-7d549b5b-xhjn4\" (UID: \"477a69f2-99fd-4b97-9c68-bf46ac6b1bb9\") " pod="kuadrant-system/limitador-limitador-7d549b5b-xhjn4" Apr 20 15:02:29.164168 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:29.164144 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87vwm\" (UniqueName: \"kubernetes.io/projected/477a69f2-99fd-4b97-9c68-bf46ac6b1bb9-kube-api-access-87vwm\") pod \"limitador-limitador-7d549b5b-xhjn4\" (UID: \"477a69f2-99fd-4b97-9c68-bf46ac6b1bb9\") " pod="kuadrant-system/limitador-limitador-7d549b5b-xhjn4" Apr 20 15:02:29.211276 ip-10-0-129-82 kubenswrapper[2575]: 
I0420 15:02:29.211249 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-xhjn4" Apr 20 15:02:29.331083 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:29.331054 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-xhjn4"] Apr 20 15:02:29.334245 ip-10-0-129-82 kubenswrapper[2575]: W0420 15:02:29.334216 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod477a69f2_99fd_4b97_9c68_bf46ac6b1bb9.slice/crio-bc5d5e75881dc99716d2c590dfdfba0e56d9631e7f8c0bc200f319b39b8f6347 WatchSource:0}: Error finding container bc5d5e75881dc99716d2c590dfdfba0e56d9631e7f8c0bc200f319b39b8f6347: Status 404 returned error can't find the container with id bc5d5e75881dc99716d2c590dfdfba0e56d9631e7f8c0bc200f319b39b8f6347 Apr 20 15:02:29.538397 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:29.538360 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-xhjn4" event={"ID":"477a69f2-99fd-4b97-9c68-bf46ac6b1bb9","Type":"ContainerStarted","Data":"bc5d5e75881dc99716d2c590dfdfba0e56d9631e7f8c0bc200f319b39b8f6347"} Apr 20 15:02:32.549091 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:32.549057 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-xhjn4" event={"ID":"477a69f2-99fd-4b97-9c68-bf46ac6b1bb9","Type":"ContainerStarted","Data":"c7c3aefdb9e0ca7ffd334e840f9c49cc7c69f3bc3fbbbd79b28f7fdc034ce6a6"} Apr 20 15:02:32.549478 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:32.549177 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-xhjn4" Apr 20 15:02:32.565227 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:32.565175 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/limitador-limitador-7d549b5b-xhjn4" podStartSLOduration=1.848759365 podStartE2EDuration="4.56516022s" podCreationTimestamp="2026-04-20 15:02:28 +0000 UTC" firstStartedPulling="2026-04-20 15:02:29.335960893 +0000 UTC m=+541.067279171" lastFinishedPulling="2026-04-20 15:02:32.052361761 +0000 UTC m=+543.783680026" observedRunningTime="2026-04-20 15:02:32.562755067 +0000 UTC m=+544.294073353" watchObservedRunningTime="2026-04-20 15:02:32.56516022 +0000 UTC m=+544.296478506" Apr 20 15:02:43.553890 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:43.553857 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-xhjn4" Apr 20 15:02:45.813679 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:45.813643 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-xhjn4"] Apr 20 15:02:45.814141 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:45.813852 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-xhjn4" podUID="477a69f2-99fd-4b97-9c68-bf46ac6b1bb9" containerName="limitador" containerID="cri-o://c7c3aefdb9e0ca7ffd334e840f9c49cc7c69f3bc3fbbbd79b28f7fdc034ce6a6" gracePeriod=30 Apr 20 15:02:46.350991 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:46.350963 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-xhjn4" Apr 20 15:02:46.410155 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:46.410056 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/477a69f2-99fd-4b97-9c68-bf46ac6b1bb9-config-file\") pod \"477a69f2-99fd-4b97-9c68-bf46ac6b1bb9\" (UID: \"477a69f2-99fd-4b97-9c68-bf46ac6b1bb9\") " Apr 20 15:02:46.410155 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:46.410123 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87vwm\" (UniqueName: \"kubernetes.io/projected/477a69f2-99fd-4b97-9c68-bf46ac6b1bb9-kube-api-access-87vwm\") pod \"477a69f2-99fd-4b97-9c68-bf46ac6b1bb9\" (UID: \"477a69f2-99fd-4b97-9c68-bf46ac6b1bb9\") " Apr 20 15:02:46.410423 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:46.410399 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/477a69f2-99fd-4b97-9c68-bf46ac6b1bb9-config-file" (OuterVolumeSpecName: "config-file") pod "477a69f2-99fd-4b97-9c68-bf46ac6b1bb9" (UID: "477a69f2-99fd-4b97-9c68-bf46ac6b1bb9"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:02:46.412280 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:46.412254 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/477a69f2-99fd-4b97-9c68-bf46ac6b1bb9-kube-api-access-87vwm" (OuterVolumeSpecName: "kube-api-access-87vwm") pod "477a69f2-99fd-4b97-9c68-bf46ac6b1bb9" (UID: "477a69f2-99fd-4b97-9c68-bf46ac6b1bb9"). InnerVolumeSpecName "kube-api-access-87vwm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:02:46.510818 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:46.510780 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-87vwm\" (UniqueName: \"kubernetes.io/projected/477a69f2-99fd-4b97-9c68-bf46ac6b1bb9-kube-api-access-87vwm\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 15:02:46.510818 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:46.510814 2575 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/477a69f2-99fd-4b97-9c68-bf46ac6b1bb9-config-file\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 15:02:46.595293 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:46.595253 2575 generic.go:358] "Generic (PLEG): container finished" podID="477a69f2-99fd-4b97-9c68-bf46ac6b1bb9" containerID="c7c3aefdb9e0ca7ffd334e840f9c49cc7c69f3bc3fbbbd79b28f7fdc034ce6a6" exitCode=0 Apr 20 15:02:46.595463 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:46.595325 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-xhjn4" Apr 20 15:02:46.595463 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:46.595324 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-xhjn4" event={"ID":"477a69f2-99fd-4b97-9c68-bf46ac6b1bb9","Type":"ContainerDied","Data":"c7c3aefdb9e0ca7ffd334e840f9c49cc7c69f3bc3fbbbd79b28f7fdc034ce6a6"} Apr 20 15:02:46.595463 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:46.595360 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-xhjn4" event={"ID":"477a69f2-99fd-4b97-9c68-bf46ac6b1bb9","Type":"ContainerDied","Data":"bc5d5e75881dc99716d2c590dfdfba0e56d9631e7f8c0bc200f319b39b8f6347"} Apr 20 15:02:46.595463 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:46.595379 2575 scope.go:117] "RemoveContainer" containerID="c7c3aefdb9e0ca7ffd334e840f9c49cc7c69f3bc3fbbbd79b28f7fdc034ce6a6" Apr 20 15:02:46.604004 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:46.603985 2575 scope.go:117] "RemoveContainer" containerID="c7c3aefdb9e0ca7ffd334e840f9c49cc7c69f3bc3fbbbd79b28f7fdc034ce6a6" Apr 20 15:02:46.604272 ip-10-0-129-82 kubenswrapper[2575]: E0420 15:02:46.604253 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7c3aefdb9e0ca7ffd334e840f9c49cc7c69f3bc3fbbbd79b28f7fdc034ce6a6\": container with ID starting with c7c3aefdb9e0ca7ffd334e840f9c49cc7c69f3bc3fbbbd79b28f7fdc034ce6a6 not found: ID does not exist" containerID="c7c3aefdb9e0ca7ffd334e840f9c49cc7c69f3bc3fbbbd79b28f7fdc034ce6a6" Apr 20 15:02:46.604330 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:46.604285 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c3aefdb9e0ca7ffd334e840f9c49cc7c69f3bc3fbbbd79b28f7fdc034ce6a6"} err="failed to get container status \"c7c3aefdb9e0ca7ffd334e840f9c49cc7c69f3bc3fbbbd79b28f7fdc034ce6a6\": 
rpc error: code = NotFound desc = could not find container \"c7c3aefdb9e0ca7ffd334e840f9c49cc7c69f3bc3fbbbd79b28f7fdc034ce6a6\": container with ID starting with c7c3aefdb9e0ca7ffd334e840f9c49cc7c69f3bc3fbbbd79b28f7fdc034ce6a6 not found: ID does not exist" Apr 20 15:02:46.615524 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:46.615481 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-xhjn4"] Apr 20 15:02:46.620959 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:46.620913 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-xhjn4"] Apr 20 15:02:46.864892 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:46.864858 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="477a69f2-99fd-4b97-9c68-bf46ac6b1bb9" path="/var/lib/kubelet/pods/477a69f2-99fd-4b97-9c68-bf46ac6b1bb9/volumes" Apr 20 15:02:50.030006 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:50.029972 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-rm7ll"] Apr 20 15:02:50.030374 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:50.030313 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="477a69f2-99fd-4b97-9c68-bf46ac6b1bb9" containerName="limitador" Apr 20 15:02:50.030374 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:50.030324 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="477a69f2-99fd-4b97-9c68-bf46ac6b1bb9" containerName="limitador" Apr 20 15:02:50.030374 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:50.030371 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="477a69f2-99fd-4b97-9c68-bf46ac6b1bb9" containerName="limitador" Apr 20 15:02:50.033310 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:50.033287 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-rm7ll" Apr 20 15:02:50.035566 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:50.035542 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 20 15:02:50.035779 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:50.035756 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-jtwgx\"" Apr 20 15:02:50.042948 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:50.042920 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/94d98787-7dcd-442b-bfea-1e1cb9833889-data\") pod \"postgres-868db5846d-rm7ll\" (UID: \"94d98787-7dcd-442b-bfea-1e1cb9833889\") " pod="opendatahub/postgres-868db5846d-rm7ll" Apr 20 15:02:50.043211 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:50.043188 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54b4d\" (UniqueName: \"kubernetes.io/projected/94d98787-7dcd-442b-bfea-1e1cb9833889-kube-api-access-54b4d\") pod \"postgres-868db5846d-rm7ll\" (UID: \"94d98787-7dcd-442b-bfea-1e1cb9833889\") " pod="opendatahub/postgres-868db5846d-rm7ll" Apr 20 15:02:50.045733 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:50.045711 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-rm7ll"] Apr 20 15:02:50.144435 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:50.144390 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54b4d\" (UniqueName: \"kubernetes.io/projected/94d98787-7dcd-442b-bfea-1e1cb9833889-kube-api-access-54b4d\") pod \"postgres-868db5846d-rm7ll\" (UID: \"94d98787-7dcd-442b-bfea-1e1cb9833889\") " pod="opendatahub/postgres-868db5846d-rm7ll" Apr 20 15:02:50.144648 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:50.144480 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/94d98787-7dcd-442b-bfea-1e1cb9833889-data\") pod \"postgres-868db5846d-rm7ll\" (UID: \"94d98787-7dcd-442b-bfea-1e1cb9833889\") " pod="opendatahub/postgres-868db5846d-rm7ll" Apr 20 15:02:50.144914 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:50.144895 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/94d98787-7dcd-442b-bfea-1e1cb9833889-data\") pod \"postgres-868db5846d-rm7ll\" (UID: \"94d98787-7dcd-442b-bfea-1e1cb9833889\") " pod="opendatahub/postgres-868db5846d-rm7ll" Apr 20 15:02:50.152449 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:50.152414 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54b4d\" (UniqueName: \"kubernetes.io/projected/94d98787-7dcd-442b-bfea-1e1cb9833889-kube-api-access-54b4d\") pod \"postgres-868db5846d-rm7ll\" (UID: \"94d98787-7dcd-442b-bfea-1e1cb9833889\") " pod="opendatahub/postgres-868db5846d-rm7ll" Apr 20 15:02:50.344887 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:50.344787 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-rm7ll" Apr 20 15:02:50.473254 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:50.473194 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-rm7ll"] Apr 20 15:02:50.476414 ip-10-0-129-82 kubenswrapper[2575]: W0420 15:02:50.476380 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94d98787_7dcd_442b_bfea_1e1cb9833889.slice/crio-604db62ff1ff06356e66933cc436d0587dbfba63fbb91420efd4d313883a3c26 WatchSource:0}: Error finding container 604db62ff1ff06356e66933cc436d0587dbfba63fbb91420efd4d313883a3c26: Status 404 returned error can't find the container with id 604db62ff1ff06356e66933cc436d0587dbfba63fbb91420efd4d313883a3c26 Apr 20 15:02:50.610702 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:50.610607 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-rm7ll" event={"ID":"94d98787-7dcd-442b-bfea-1e1cb9833889","Type":"ContainerStarted","Data":"604db62ff1ff06356e66933cc436d0587dbfba63fbb91420efd4d313883a3c26"} Apr 20 15:02:56.632961 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:56.632855 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-rm7ll" event={"ID":"94d98787-7dcd-442b-bfea-1e1cb9833889","Type":"ContainerStarted","Data":"e133bad02b01ff412e1a4c83aeda9c4de0f5daf17805a0c73935284ec13e4023"} Apr 20 15:02:56.632961 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:56.632922 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-rm7ll" Apr 20 15:02:56.648676 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:02:56.648621 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-rm7ll" podStartSLOduration=0.847898873 podStartE2EDuration="6.648600673s" podCreationTimestamp="2026-04-20 15:02:50 +0000 UTC" 
firstStartedPulling="2026-04-20 15:02:50.477611419 +0000 UTC m=+562.208929685" lastFinishedPulling="2026-04-20 15:02:56.27831322 +0000 UTC m=+568.009631485" observedRunningTime="2026-04-20 15:02:56.646927394 +0000 UTC m=+568.378245678" watchObservedRunningTime="2026-04-20 15:02:56.648600673 +0000 UTC m=+568.379918961" Apr 20 15:03:02.664235 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:02.664205 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-rm7ll" Apr 20 15:03:05.777259 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:05.777224 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-68tjq"] Apr 20 15:03:05.781086 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:05.781066 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-68tjq" Apr 20 15:03:05.783486 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:05.783468 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-nb4wt\"" Apr 20 15:03:05.786663 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:05.786639 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-68tjq"] Apr 20 15:03:05.885877 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:05.885835 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2qdf\" (UniqueName: \"kubernetes.io/projected/28621a96-751e-4f92-888d-758bef877d62-kube-api-access-f2qdf\") pod \"maas-controller-6d4c8f55f9-68tjq\" (UID: \"28621a96-751e-4f92-888d-758bef877d62\") " pod="opendatahub/maas-controller-6d4c8f55f9-68tjq" Apr 20 15:03:05.924285 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:05.924232 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6f7cd44b76-49mfj"] Apr 20 15:03:05.927778 ip-10-0-129-82 
kubenswrapper[2575]: I0420 15:03:05.927754 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6f7cd44b76-49mfj" Apr 20 15:03:05.936149 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:05.936120 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6f7cd44b76-49mfj"] Apr 20 15:03:05.987220 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:05.987188 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2qdf\" (UniqueName: \"kubernetes.io/projected/28621a96-751e-4f92-888d-758bef877d62-kube-api-access-f2qdf\") pod \"maas-controller-6d4c8f55f9-68tjq\" (UID: \"28621a96-751e-4f92-888d-758bef877d62\") " pod="opendatahub/maas-controller-6d4c8f55f9-68tjq" Apr 20 15:03:05.987410 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:05.987252 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86b9l\" (UniqueName: \"kubernetes.io/projected/93f4acdc-074c-4a3e-b8ac-101b788f495c-kube-api-access-86b9l\") pod \"maas-controller-6f7cd44b76-49mfj\" (UID: \"93f4acdc-074c-4a3e-b8ac-101b788f495c\") " pod="opendatahub/maas-controller-6f7cd44b76-49mfj" Apr 20 15:03:05.994833 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:05.994807 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2qdf\" (UniqueName: \"kubernetes.io/projected/28621a96-751e-4f92-888d-758bef877d62-kube-api-access-f2qdf\") pod \"maas-controller-6d4c8f55f9-68tjq\" (UID: \"28621a96-751e-4f92-888d-758bef877d62\") " pod="opendatahub/maas-controller-6d4c8f55f9-68tjq" Apr 20 15:03:06.064483 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:06.064410 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-68tjq"] Apr 20 15:03:06.064714 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:06.064701 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-68tjq" Apr 20 15:03:06.087088 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:06.087055 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6cfc774bf8-ch89k"] Apr 20 15:03:06.088698 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:06.088668 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86b9l\" (UniqueName: \"kubernetes.io/projected/93f4acdc-074c-4a3e-b8ac-101b788f495c-kube-api-access-86b9l\") pod \"maas-controller-6f7cd44b76-49mfj\" (UID: \"93f4acdc-074c-4a3e-b8ac-101b788f495c\") " pod="opendatahub/maas-controller-6f7cd44b76-49mfj" Apr 20 15:03:06.091679 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:06.091659 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6cfc774bf8-ch89k" Apr 20 15:03:06.099474 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:06.099448 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6cfc774bf8-ch89k"] Apr 20 15:03:06.101921 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:06.101899 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86b9l\" (UniqueName: \"kubernetes.io/projected/93f4acdc-074c-4a3e-b8ac-101b788f495c-kube-api-access-86b9l\") pod \"maas-controller-6f7cd44b76-49mfj\" (UID: \"93f4acdc-074c-4a3e-b8ac-101b788f495c\") " pod="opendatahub/maas-controller-6f7cd44b76-49mfj" Apr 20 15:03:06.190085 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:06.189982 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7pds\" (UniqueName: \"kubernetes.io/projected/49638281-592c-4066-9ac7-d16d0b5e648c-kube-api-access-m7pds\") pod \"maas-controller-6cfc774bf8-ch89k\" (UID: \"49638281-592c-4066-9ac7-d16d0b5e648c\") " pod="opendatahub/maas-controller-6cfc774bf8-ch89k" Apr 20 15:03:06.190164 ip-10-0-129-82 
kubenswrapper[2575]: I0420 15:03:06.190094 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-68tjq"] Apr 20 15:03:06.192389 ip-10-0-129-82 kubenswrapper[2575]: W0420 15:03:06.192364 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28621a96_751e_4f92_888d_758bef877d62.slice/crio-1daa24495b80d14c28ddb1e91d5e5a45a0f9ae1188b135fc0bcd65bd8fe914cd WatchSource:0}: Error finding container 1daa24495b80d14c28ddb1e91d5e5a45a0f9ae1188b135fc0bcd65bd8fe914cd: Status 404 returned error can't find the container with id 1daa24495b80d14c28ddb1e91d5e5a45a0f9ae1188b135fc0bcd65bd8fe914cd Apr 20 15:03:06.238099 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:06.238065 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6f7cd44b76-49mfj" Apr 20 15:03:06.290891 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:06.290859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7pds\" (UniqueName: \"kubernetes.io/projected/49638281-592c-4066-9ac7-d16d0b5e648c-kube-api-access-m7pds\") pod \"maas-controller-6cfc774bf8-ch89k\" (UID: \"49638281-592c-4066-9ac7-d16d0b5e648c\") " pod="opendatahub/maas-controller-6cfc774bf8-ch89k" Apr 20 15:03:06.298550 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:06.298491 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7pds\" (UniqueName: \"kubernetes.io/projected/49638281-592c-4066-9ac7-d16d0b5e648c-kube-api-access-m7pds\") pod \"maas-controller-6cfc774bf8-ch89k\" (UID: \"49638281-592c-4066-9ac7-d16d0b5e648c\") " pod="opendatahub/maas-controller-6cfc774bf8-ch89k" Apr 20 15:03:06.354312 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:06.354286 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6f7cd44b76-49mfj"] Apr 20 15:03:06.356305 ip-10-0-129-82 
kubenswrapper[2575]: W0420 15:03:06.356280 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93f4acdc_074c_4a3e_b8ac_101b788f495c.slice/crio-0473814f9ae5a959fd97baa4491d29c8e18bdf85f5bea8a71055619c6a58b946 WatchSource:0}: Error finding container 0473814f9ae5a959fd97baa4491d29c8e18bdf85f5bea8a71055619c6a58b946: Status 404 returned error can't find the container with id 0473814f9ae5a959fd97baa4491d29c8e18bdf85f5bea8a71055619c6a58b946 Apr 20 15:03:06.404129 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:06.404099 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6cfc774bf8-ch89k" Apr 20 15:03:06.522235 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:06.522211 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6cfc774bf8-ch89k"] Apr 20 15:03:06.524338 ip-10-0-129-82 kubenswrapper[2575]: W0420 15:03:06.524307 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49638281_592c_4066_9ac7_d16d0b5e648c.slice/crio-526460797bfc809eadbd2bfbbdfb9f5fe76a7fb95e5f90c5a671e3ba6df220c4 WatchSource:0}: Error finding container 526460797bfc809eadbd2bfbbdfb9f5fe76a7fb95e5f90c5a671e3ba6df220c4: Status 404 returned error can't find the container with id 526460797bfc809eadbd2bfbbdfb9f5fe76a7fb95e5f90c5a671e3ba6df220c4 Apr 20 15:03:06.667248 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:06.667154 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6cfc774bf8-ch89k" event={"ID":"49638281-592c-4066-9ac7-d16d0b5e648c","Type":"ContainerStarted","Data":"526460797bfc809eadbd2bfbbdfb9f5fe76a7fb95e5f90c5a671e3ba6df220c4"} Apr 20 15:03:06.668477 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:06.668439 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-68tjq" 
event={"ID":"28621a96-751e-4f92-888d-758bef877d62","Type":"ContainerStarted","Data":"1daa24495b80d14c28ddb1e91d5e5a45a0f9ae1188b135fc0bcd65bd8fe914cd"} Apr 20 15:03:06.669994 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:06.669969 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6f7cd44b76-49mfj" event={"ID":"93f4acdc-074c-4a3e-b8ac-101b788f495c","Type":"ContainerStarted","Data":"0473814f9ae5a959fd97baa4491d29c8e18bdf85f5bea8a71055619c6a58b946"} Apr 20 15:03:10.685691 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:10.685658 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6cfc774bf8-ch89k" event={"ID":"49638281-592c-4066-9ac7-d16d0b5e648c","Type":"ContainerStarted","Data":"3558cea8ff331209869c78d9ab892bf69f0c077fe86c7c4e98f3a04d07af1757"} Apr 20 15:03:10.686117 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:10.685905 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6cfc774bf8-ch89k" Apr 20 15:03:10.687113 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:10.687080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-68tjq" event={"ID":"28621a96-751e-4f92-888d-758bef877d62","Type":"ContainerStarted","Data":"2b7cb198aa3e32a825bf0ca2c253fd2371b0d6bf328e1605755e0cb991b40e58"} Apr 20 15:03:10.687227 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:10.687113 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-68tjq" podUID="28621a96-751e-4f92-888d-758bef877d62" containerName="manager" containerID="cri-o://2b7cb198aa3e32a825bf0ca2c253fd2371b0d6bf328e1605755e0cb991b40e58" gracePeriod=10 Apr 20 15:03:10.687227 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:10.687128 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-68tjq" Apr 20 15:03:10.688301 
ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:10.688277 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6f7cd44b76-49mfj" event={"ID":"93f4acdc-074c-4a3e-b8ac-101b788f495c","Type":"ContainerStarted","Data":"2cbb21d478386b4bd773910491c9962bf64d71ce67c69a8b35d08a038d7ecd64"} Apr 20 15:03:10.688424 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:10.688413 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6f7cd44b76-49mfj" Apr 20 15:03:10.742103 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:10.742057 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6f7cd44b76-49mfj" podStartSLOduration=1.884244057 podStartE2EDuration="5.742041543s" podCreationTimestamp="2026-04-20 15:03:05 +0000 UTC" firstStartedPulling="2026-04-20 15:03:06.357674238 +0000 UTC m=+578.088992505" lastFinishedPulling="2026-04-20 15:03:10.215471726 +0000 UTC m=+581.946789991" observedRunningTime="2026-04-20 15:03:10.74090165 +0000 UTC m=+582.472219938" watchObservedRunningTime="2026-04-20 15:03:10.742041543 +0000 UTC m=+582.473359830" Apr 20 15:03:10.742978 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:10.742939 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6cfc774bf8-ch89k" podStartSLOduration=1.041275516 podStartE2EDuration="4.742928104s" podCreationTimestamp="2026-04-20 15:03:06 +0000 UTC" firstStartedPulling="2026-04-20 15:03:06.525613163 +0000 UTC m=+578.256931430" lastFinishedPulling="2026-04-20 15:03:10.227265738 +0000 UTC m=+581.958584018" observedRunningTime="2026-04-20 15:03:10.709413842 +0000 UTC m=+582.440732130" watchObservedRunningTime="2026-04-20 15:03:10.742928104 +0000 UTC m=+582.474246392" Apr 20 15:03:10.780334 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:10.780275 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/maas-controller-6d4c8f55f9-68tjq" podStartSLOduration=1.813324062 podStartE2EDuration="5.780261225s" podCreationTimestamp="2026-04-20 15:03:05 +0000 UTC" firstStartedPulling="2026-04-20 15:03:06.193636575 +0000 UTC m=+577.924954842" lastFinishedPulling="2026-04-20 15:03:10.160573729 +0000 UTC m=+581.891892005" observedRunningTime="2026-04-20 15:03:10.777644774 +0000 UTC m=+582.508963061" watchObservedRunningTime="2026-04-20 15:03:10.780261225 +0000 UTC m=+582.511579512" Apr 20 15:03:10.930407 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:10.930384 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-68tjq" Apr 20 15:03:11.038012 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:11.037976 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2qdf\" (UniqueName: \"kubernetes.io/projected/28621a96-751e-4f92-888d-758bef877d62-kube-api-access-f2qdf\") pod \"28621a96-751e-4f92-888d-758bef877d62\" (UID: \"28621a96-751e-4f92-888d-758bef877d62\") " Apr 20 15:03:11.040175 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:11.040145 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28621a96-751e-4f92-888d-758bef877d62-kube-api-access-f2qdf" (OuterVolumeSpecName: "kube-api-access-f2qdf") pod "28621a96-751e-4f92-888d-758bef877d62" (UID: "28621a96-751e-4f92-888d-758bef877d62"). InnerVolumeSpecName "kube-api-access-f2qdf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:03:11.139067 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:11.139028 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f2qdf\" (UniqueName: \"kubernetes.io/projected/28621a96-751e-4f92-888d-758bef877d62-kube-api-access-f2qdf\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 15:03:11.693066 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:11.693030 2575 generic.go:358] "Generic (PLEG): container finished" podID="28621a96-751e-4f92-888d-758bef877d62" containerID="2b7cb198aa3e32a825bf0ca2c253fd2371b0d6bf328e1605755e0cb991b40e58" exitCode=0 Apr 20 15:03:11.693487 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:11.693087 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-68tjq" Apr 20 15:03:11.693487 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:11.693115 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-68tjq" event={"ID":"28621a96-751e-4f92-888d-758bef877d62","Type":"ContainerDied","Data":"2b7cb198aa3e32a825bf0ca2c253fd2371b0d6bf328e1605755e0cb991b40e58"} Apr 20 15:03:11.693487 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:11.693154 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-68tjq" event={"ID":"28621a96-751e-4f92-888d-758bef877d62","Type":"ContainerDied","Data":"1daa24495b80d14c28ddb1e91d5e5a45a0f9ae1188b135fc0bcd65bd8fe914cd"} Apr 20 15:03:11.693487 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:11.693175 2575 scope.go:117] "RemoveContainer" containerID="2b7cb198aa3e32a825bf0ca2c253fd2371b0d6bf328e1605755e0cb991b40e58" Apr 20 15:03:11.701505 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:11.701485 2575 scope.go:117] "RemoveContainer" containerID="2b7cb198aa3e32a825bf0ca2c253fd2371b0d6bf328e1605755e0cb991b40e58" Apr 20 15:03:11.701788 ip-10-0-129-82 kubenswrapper[2575]: E0420 
15:03:11.701769 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b7cb198aa3e32a825bf0ca2c253fd2371b0d6bf328e1605755e0cb991b40e58\": container with ID starting with 2b7cb198aa3e32a825bf0ca2c253fd2371b0d6bf328e1605755e0cb991b40e58 not found: ID does not exist" containerID="2b7cb198aa3e32a825bf0ca2c253fd2371b0d6bf328e1605755e0cb991b40e58" Apr 20 15:03:11.701851 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:11.701798 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b7cb198aa3e32a825bf0ca2c253fd2371b0d6bf328e1605755e0cb991b40e58"} err="failed to get container status \"2b7cb198aa3e32a825bf0ca2c253fd2371b0d6bf328e1605755e0cb991b40e58\": rpc error: code = NotFound desc = could not find container \"2b7cb198aa3e32a825bf0ca2c253fd2371b0d6bf328e1605755e0cb991b40e58\": container with ID starting with 2b7cb198aa3e32a825bf0ca2c253fd2371b0d6bf328e1605755e0cb991b40e58 not found: ID does not exist" Apr 20 15:03:11.712666 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:11.712637 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-68tjq"] Apr 20 15:03:11.718543 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:11.718493 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-68tjq"] Apr 20 15:03:12.864843 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:12.864814 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28621a96-751e-4f92-888d-758bef877d62" path="/var/lib/kubelet/pods/28621a96-751e-4f92-888d-758bef877d62/volumes" Apr 20 15:03:21.698071 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:21.698040 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6cfc774bf8-ch89k" Apr 20 15:03:21.698963 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:21.698944 2575 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6f7cd44b76-49mfj" Apr 20 15:03:21.734906 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:21.734870 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6f7cd44b76-49mfj"] Apr 20 15:03:21.735139 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:21.735100 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6f7cd44b76-49mfj" podUID="93f4acdc-074c-4a3e-b8ac-101b788f495c" containerName="manager" containerID="cri-o://2cbb21d478386b4bd773910491c9962bf64d71ce67c69a8b35d08a038d7ecd64" gracePeriod=10 Apr 20 15:03:21.984956 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:21.984930 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6f7cd44b76-49mfj" Apr 20 15:03:22.046115 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.046071 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-f96598d7b-zfzzp"] Apr 20 15:03:22.046479 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.046464 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93f4acdc-074c-4a3e-b8ac-101b788f495c" containerName="manager" Apr 20 15:03:22.046479 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.046479 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f4acdc-074c-4a3e-b8ac-101b788f495c" containerName="manager" Apr 20 15:03:22.046646 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.046489 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28621a96-751e-4f92-888d-758bef877d62" containerName="manager" Apr 20 15:03:22.046646 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.046495 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="28621a96-751e-4f92-888d-758bef877d62" containerName="manager" Apr 20 15:03:22.046646 ip-10-0-129-82 kubenswrapper[2575]: I0420 
15:03:22.046579 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="93f4acdc-074c-4a3e-b8ac-101b788f495c" containerName="manager" Apr 20 15:03:22.046646 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.046591 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="28621a96-751e-4f92-888d-758bef877d62" containerName="manager" Apr 20 15:03:22.049601 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.049574 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f96598d7b-zfzzp" Apr 20 15:03:22.057281 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.057253 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f96598d7b-zfzzp"] Apr 20 15:03:22.138265 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.138223 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86b9l\" (UniqueName: \"kubernetes.io/projected/93f4acdc-074c-4a3e-b8ac-101b788f495c-kube-api-access-86b9l\") pod \"93f4acdc-074c-4a3e-b8ac-101b788f495c\" (UID: \"93f4acdc-074c-4a3e-b8ac-101b788f495c\") " Apr 20 15:03:22.138460 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.138443 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp7w9\" (UniqueName: \"kubernetes.io/projected/31e1b91d-8e9a-44de-b47e-bf47227c7b1f-kube-api-access-jp7w9\") pod \"maas-controller-f96598d7b-zfzzp\" (UID: \"31e1b91d-8e9a-44de-b47e-bf47227c7b1f\") " pod="opendatahub/maas-controller-f96598d7b-zfzzp" Apr 20 15:03:22.140288 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.140254 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f4acdc-074c-4a3e-b8ac-101b788f495c-kube-api-access-86b9l" (OuterVolumeSpecName: "kube-api-access-86b9l") pod "93f4acdc-074c-4a3e-b8ac-101b788f495c" (UID: "93f4acdc-074c-4a3e-b8ac-101b788f495c"). 
InnerVolumeSpecName "kube-api-access-86b9l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:03:22.239646 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.239554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jp7w9\" (UniqueName: \"kubernetes.io/projected/31e1b91d-8e9a-44de-b47e-bf47227c7b1f-kube-api-access-jp7w9\") pod \"maas-controller-f96598d7b-zfzzp\" (UID: \"31e1b91d-8e9a-44de-b47e-bf47227c7b1f\") " pod="opendatahub/maas-controller-f96598d7b-zfzzp" Apr 20 15:03:22.239799 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.239661 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-86b9l\" (UniqueName: \"kubernetes.io/projected/93f4acdc-074c-4a3e-b8ac-101b788f495c-kube-api-access-86b9l\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 15:03:22.247862 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.247833 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp7w9\" (UniqueName: \"kubernetes.io/projected/31e1b91d-8e9a-44de-b47e-bf47227c7b1f-kube-api-access-jp7w9\") pod \"maas-controller-f96598d7b-zfzzp\" (UID: \"31e1b91d-8e9a-44de-b47e-bf47227c7b1f\") " pod="opendatahub/maas-controller-f96598d7b-zfzzp" Apr 20 15:03:22.362828 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.362789 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-f96598d7b-zfzzp" Apr 20 15:03:22.485547 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.485498 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f96598d7b-zfzzp"] Apr 20 15:03:22.488899 ip-10-0-129-82 kubenswrapper[2575]: W0420 15:03:22.488871 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31e1b91d_8e9a_44de_b47e_bf47227c7b1f.slice/crio-357e082e521938e7ffb01c0ae60e6547e363f3774041b79b6a89270e4edf3517 WatchSource:0}: Error finding container 357e082e521938e7ffb01c0ae60e6547e363f3774041b79b6a89270e4edf3517: Status 404 returned error can't find the container with id 357e082e521938e7ffb01c0ae60e6547e363f3774041b79b6a89270e4edf3517 Apr 20 15:03:22.735155 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.735119 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f96598d7b-zfzzp" event={"ID":"31e1b91d-8e9a-44de-b47e-bf47227c7b1f","Type":"ContainerStarted","Data":"357e082e521938e7ffb01c0ae60e6547e363f3774041b79b6a89270e4edf3517"} Apr 20 15:03:22.736255 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.736229 2575 generic.go:358] "Generic (PLEG): container finished" podID="93f4acdc-074c-4a3e-b8ac-101b788f495c" containerID="2cbb21d478386b4bd773910491c9962bf64d71ce67c69a8b35d08a038d7ecd64" exitCode=0 Apr 20 15:03:22.736362 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.736265 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6f7cd44b76-49mfj" event={"ID":"93f4acdc-074c-4a3e-b8ac-101b788f495c","Type":"ContainerDied","Data":"2cbb21d478386b4bd773910491c9962bf64d71ce67c69a8b35d08a038d7ecd64"} Apr 20 15:03:22.736362 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.736288 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6f7cd44b76-49mfj" 
event={"ID":"93f4acdc-074c-4a3e-b8ac-101b788f495c","Type":"ContainerDied","Data":"0473814f9ae5a959fd97baa4491d29c8e18bdf85f5bea8a71055619c6a58b946"} Apr 20 15:03:22.736362 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.736299 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6f7cd44b76-49mfj" Apr 20 15:03:22.736462 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.736303 2575 scope.go:117] "RemoveContainer" containerID="2cbb21d478386b4bd773910491c9962bf64d71ce67c69a8b35d08a038d7ecd64" Apr 20 15:03:22.745284 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.745256 2575 scope.go:117] "RemoveContainer" containerID="2cbb21d478386b4bd773910491c9962bf64d71ce67c69a8b35d08a038d7ecd64" Apr 20 15:03:22.745580 ip-10-0-129-82 kubenswrapper[2575]: E0420 15:03:22.745561 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cbb21d478386b4bd773910491c9962bf64d71ce67c69a8b35d08a038d7ecd64\": container with ID starting with 2cbb21d478386b4bd773910491c9962bf64d71ce67c69a8b35d08a038d7ecd64 not found: ID does not exist" containerID="2cbb21d478386b4bd773910491c9962bf64d71ce67c69a8b35d08a038d7ecd64" Apr 20 15:03:22.745645 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.745588 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cbb21d478386b4bd773910491c9962bf64d71ce67c69a8b35d08a038d7ecd64"} err="failed to get container status \"2cbb21d478386b4bd773910491c9962bf64d71ce67c69a8b35d08a038d7ecd64\": rpc error: code = NotFound desc = could not find container \"2cbb21d478386b4bd773910491c9962bf64d71ce67c69a8b35d08a038d7ecd64\": container with ID starting with 2cbb21d478386b4bd773910491c9962bf64d71ce67c69a8b35d08a038d7ecd64 not found: ID does not exist" Apr 20 15:03:22.767909 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.767879 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["opendatahub/maas-controller-6f7cd44b76-49mfj"] Apr 20 15:03:22.769863 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.769834 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6f7cd44b76-49mfj"] Apr 20 15:03:22.865653 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:22.865621 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93f4acdc-074c-4a3e-b8ac-101b788f495c" path="/var/lib/kubelet/pods/93f4acdc-074c-4a3e-b8ac-101b788f495c/volumes" Apr 20 15:03:23.741591 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:23.741556 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f96598d7b-zfzzp" event={"ID":"31e1b91d-8e9a-44de-b47e-bf47227c7b1f","Type":"ContainerStarted","Data":"fae15cf0fe47c573f0cb89f49abb9a8dbae24513f1087038ddd1368d7b77b18f"} Apr 20 15:03:23.742027 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:23.741750 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-f96598d7b-zfzzp" Apr 20 15:03:23.757096 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:23.757047 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-f96598d7b-zfzzp" podStartSLOduration=1.2925687639999999 podStartE2EDuration="1.757034229s" podCreationTimestamp="2026-04-20 15:03:22 +0000 UTC" firstStartedPulling="2026-04-20 15:03:22.490143346 +0000 UTC m=+594.221461616" lastFinishedPulling="2026-04-20 15:03:22.954608804 +0000 UTC m=+594.685927081" observedRunningTime="2026-04-20 15:03:23.755369696 +0000 UTC m=+595.486687983" watchObservedRunningTime="2026-04-20 15:03:23.757034229 +0000 UTC m=+595.488352517" Apr 20 15:03:28.817652 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:28.817192 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/2.log" Apr 20 15:03:28.817652 
ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:28.817394 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/2.log" Apr 20 15:03:28.823527 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:28.823488 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-acl-logging/0.log" Apr 20 15:03:28.823640 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:28.823626 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-acl-logging/0.log" Apr 20 15:03:29.252021 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:29.251974 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-66744456cc-w6qjh"] Apr 20 15:03:29.278266 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:29.278233 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-66744456cc-w6qjh"] Apr 20 15:03:29.278432 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:29.278357 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-66744456cc-w6qjh" Apr 20 15:03:29.280817 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:29.280789 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 20 15:03:29.281095 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:29.281080 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 20 15:03:29.310010 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:29.309972 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/720b8ce8-54f9-459d-b0bf-e89728715adc-maas-api-tls\") pod \"maas-api-66744456cc-w6qjh\" (UID: \"720b8ce8-54f9-459d-b0bf-e89728715adc\") " pod="opendatahub/maas-api-66744456cc-w6qjh" Apr 20 15:03:29.310161 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:29.310019 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvlvf\" (UniqueName: \"kubernetes.io/projected/720b8ce8-54f9-459d-b0bf-e89728715adc-kube-api-access-lvlvf\") pod \"maas-api-66744456cc-w6qjh\" (UID: \"720b8ce8-54f9-459d-b0bf-e89728715adc\") " pod="opendatahub/maas-api-66744456cc-w6qjh" Apr 20 15:03:29.411334 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:29.411290 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/720b8ce8-54f9-459d-b0bf-e89728715adc-maas-api-tls\") pod \"maas-api-66744456cc-w6qjh\" (UID: \"720b8ce8-54f9-459d-b0bf-e89728715adc\") " pod="opendatahub/maas-api-66744456cc-w6qjh" Apr 20 15:03:29.411334 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:29.411335 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lvlvf\" (UniqueName: 
\"kubernetes.io/projected/720b8ce8-54f9-459d-b0bf-e89728715adc-kube-api-access-lvlvf\") pod \"maas-api-66744456cc-w6qjh\" (UID: \"720b8ce8-54f9-459d-b0bf-e89728715adc\") " pod="opendatahub/maas-api-66744456cc-w6qjh" Apr 20 15:03:29.411613 ip-10-0-129-82 kubenswrapper[2575]: E0420 15:03:29.411489 2575 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found Apr 20 15:03:29.411613 ip-10-0-129-82 kubenswrapper[2575]: E0420 15:03:29.411608 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/720b8ce8-54f9-459d-b0bf-e89728715adc-maas-api-tls podName:720b8ce8-54f9-459d-b0bf-e89728715adc nodeName:}" failed. No retries permitted until 2026-04-20 15:03:29.911583994 +0000 UTC m=+601.642902272 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/720b8ce8-54f9-459d-b0bf-e89728715adc-maas-api-tls") pod "maas-api-66744456cc-w6qjh" (UID: "720b8ce8-54f9-459d-b0bf-e89728715adc") : secret "maas-api-serving-cert" not found Apr 20 15:03:29.420605 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:29.420561 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvlvf\" (UniqueName: \"kubernetes.io/projected/720b8ce8-54f9-459d-b0bf-e89728715adc-kube-api-access-lvlvf\") pod \"maas-api-66744456cc-w6qjh\" (UID: \"720b8ce8-54f9-459d-b0bf-e89728715adc\") " pod="opendatahub/maas-api-66744456cc-w6qjh" Apr 20 15:03:29.916600 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:29.916555 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/720b8ce8-54f9-459d-b0bf-e89728715adc-maas-api-tls\") pod \"maas-api-66744456cc-w6qjh\" (UID: \"720b8ce8-54f9-459d-b0bf-e89728715adc\") " pod="opendatahub/maas-api-66744456cc-w6qjh" Apr 20 15:03:29.919089 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:29.919066 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/720b8ce8-54f9-459d-b0bf-e89728715adc-maas-api-tls\") pod \"maas-api-66744456cc-w6qjh\" (UID: \"720b8ce8-54f9-459d-b0bf-e89728715adc\") " pod="opendatahub/maas-api-66744456cc-w6qjh" Apr 20 15:03:30.191448 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:30.191348 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-66744456cc-w6qjh" Apr 20 15:03:30.321678 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:30.321647 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-66744456cc-w6qjh"] Apr 20 15:03:30.325468 ip-10-0-129-82 kubenswrapper[2575]: W0420 15:03:30.325439 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod720b8ce8_54f9_459d_b0bf_e89728715adc.slice/crio-1cb3fc4a3bbf98341998796fdca384720af397b7f29b65294e4deebea669da93 WatchSource:0}: Error finding container 1cb3fc4a3bbf98341998796fdca384720af397b7f29b65294e4deebea669da93: Status 404 returned error can't find the container with id 1cb3fc4a3bbf98341998796fdca384720af397b7f29b65294e4deebea669da93 Apr 20 15:03:30.768826 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:30.768788 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-66744456cc-w6qjh" event={"ID":"720b8ce8-54f9-459d-b0bf-e89728715adc","Type":"ContainerStarted","Data":"1cb3fc4a3bbf98341998796fdca384720af397b7f29b65294e4deebea669da93"} Apr 20 15:03:32.778325 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:32.778290 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-66744456cc-w6qjh" event={"ID":"720b8ce8-54f9-459d-b0bf-e89728715adc","Type":"ContainerStarted","Data":"330a23edccaefb8690d61c45c320db411bd9d4c06f82dbc5b061e069ba3cf05e"} Apr 20 15:03:32.778781 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:32.778412 2575 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="opendatahub/maas-api-66744456cc-w6qjh" Apr 20 15:03:32.795327 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:32.795276 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-66744456cc-w6qjh" podStartSLOduration=2.328183594 podStartE2EDuration="3.795260695s" podCreationTimestamp="2026-04-20 15:03:29 +0000 UTC" firstStartedPulling="2026-04-20 15:03:30.326762324 +0000 UTC m=+602.058080592" lastFinishedPulling="2026-04-20 15:03:31.793839424 +0000 UTC m=+603.525157693" observedRunningTime="2026-04-20 15:03:32.792896323 +0000 UTC m=+604.524214623" watchObservedRunningTime="2026-04-20 15:03:32.795260695 +0000 UTC m=+604.526579017" Apr 20 15:03:34.750990 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:34.750954 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-f96598d7b-zfzzp" Apr 20 15:03:34.798092 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:34.798057 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6cfc774bf8-ch89k"] Apr 20 15:03:34.798665 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:34.798630 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6cfc774bf8-ch89k" podUID="49638281-592c-4066-9ac7-d16d0b5e648c" containerName="manager" containerID="cri-o://3558cea8ff331209869c78d9ab892bf69f0c077fe86c7c4e98f3a04d07af1757" gracePeriod=10 Apr 20 15:03:35.044823 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:35.044800 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6cfc774bf8-ch89k" Apr 20 15:03:35.165338 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:35.165300 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7pds\" (UniqueName: \"kubernetes.io/projected/49638281-592c-4066-9ac7-d16d0b5e648c-kube-api-access-m7pds\") pod \"49638281-592c-4066-9ac7-d16d0b5e648c\" (UID: \"49638281-592c-4066-9ac7-d16d0b5e648c\") " Apr 20 15:03:35.167307 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:35.167274 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49638281-592c-4066-9ac7-d16d0b5e648c-kube-api-access-m7pds" (OuterVolumeSpecName: "kube-api-access-m7pds") pod "49638281-592c-4066-9ac7-d16d0b5e648c" (UID: "49638281-592c-4066-9ac7-d16d0b5e648c"). InnerVolumeSpecName "kube-api-access-m7pds". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:03:35.266383 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:35.266342 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m7pds\" (UniqueName: \"kubernetes.io/projected/49638281-592c-4066-9ac7-d16d0b5e648c-kube-api-access-m7pds\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 15:03:35.790392 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:35.790352 2575 generic.go:358] "Generic (PLEG): container finished" podID="49638281-592c-4066-9ac7-d16d0b5e648c" containerID="3558cea8ff331209869c78d9ab892bf69f0c077fe86c7c4e98f3a04d07af1757" exitCode=0 Apr 20 15:03:35.790804 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:35.790408 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6cfc774bf8-ch89k" event={"ID":"49638281-592c-4066-9ac7-d16d0b5e648c","Type":"ContainerDied","Data":"3558cea8ff331209869c78d9ab892bf69f0c077fe86c7c4e98f3a04d07af1757"} Apr 20 15:03:35.790804 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:35.790441 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="opendatahub/maas-controller-6cfc774bf8-ch89k" event={"ID":"49638281-592c-4066-9ac7-d16d0b5e648c","Type":"ContainerDied","Data":"526460797bfc809eadbd2bfbbdfb9f5fe76a7fb95e5f90c5a671e3ba6df220c4"} Apr 20 15:03:35.790804 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:35.790456 2575 scope.go:117] "RemoveContainer" containerID="3558cea8ff331209869c78d9ab892bf69f0c077fe86c7c4e98f3a04d07af1757" Apr 20 15:03:35.790804 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:35.790412 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6cfc774bf8-ch89k" Apr 20 15:03:35.799096 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:35.799074 2575 scope.go:117] "RemoveContainer" containerID="3558cea8ff331209869c78d9ab892bf69f0c077fe86c7c4e98f3a04d07af1757" Apr 20 15:03:35.799368 ip-10-0-129-82 kubenswrapper[2575]: E0420 15:03:35.799347 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3558cea8ff331209869c78d9ab892bf69f0c077fe86c7c4e98f3a04d07af1757\": container with ID starting with 3558cea8ff331209869c78d9ab892bf69f0c077fe86c7c4e98f3a04d07af1757 not found: ID does not exist" containerID="3558cea8ff331209869c78d9ab892bf69f0c077fe86c7c4e98f3a04d07af1757" Apr 20 15:03:35.799416 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:35.799377 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3558cea8ff331209869c78d9ab892bf69f0c077fe86c7c4e98f3a04d07af1757"} err="failed to get container status \"3558cea8ff331209869c78d9ab892bf69f0c077fe86c7c4e98f3a04d07af1757\": rpc error: code = NotFound desc = could not find container \"3558cea8ff331209869c78d9ab892bf69f0c077fe86c7c4e98f3a04d07af1757\": container with ID starting with 3558cea8ff331209869c78d9ab892bf69f0c077fe86c7c4e98f3a04d07af1757 not found: ID does not exist" Apr 20 15:03:35.810604 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:35.810576 
2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6cfc774bf8-ch89k"]
Apr 20 15:03:35.812116 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:35.812082 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6cfc774bf8-ch89k"]
Apr 20 15:03:36.865014 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:36.864982 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49638281-592c-4066-9ac7-d16d0b5e648c" path="/var/lib/kubelet/pods/49638281-592c-4066-9ac7-d16d0b5e648c/volumes"
Apr 20 15:03:38.787112 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:38.787078 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-66744456cc-w6qjh"
Apr 20 15:03:55.739813 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.739777 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"]
Apr 20 15:03:55.740294 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.740138 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49638281-592c-4066-9ac7-d16d0b5e648c" containerName="manager"
Apr 20 15:03:55.740294 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.740150 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="49638281-592c-4066-9ac7-d16d0b5e648c" containerName="manager"
Apr 20 15:03:55.740294 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.740230 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="49638281-592c-4066-9ac7-d16d0b5e648c" containerName="manager"
Apr 20 15:03:55.743423 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.743399 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:03:55.746671 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.746645 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 20 15:03:55.746845 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.746647 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-t5pz2\""
Apr 20 15:03:55.746845 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.746648 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 20 15:03:55.746845 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.746648 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\""
Apr 20 15:03:55.754209 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.754183 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"]
Apr 20 15:03:55.855910 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.855863 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb668fe3-743d-479c-807b-ee99d9df6fa0-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br\" (UID: \"fb668fe3-743d-479c-807b-ee99d9df6fa0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:03:55.855910 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.855914 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fb668fe3-743d-479c-807b-ee99d9df6fa0-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br\" (UID: \"fb668fe3-743d-479c-807b-ee99d9df6fa0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:03:55.856152 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.855940 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw5vr\" (UniqueName: \"kubernetes.io/projected/fb668fe3-743d-479c-807b-ee99d9df6fa0-kube-api-access-jw5vr\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br\" (UID: \"fb668fe3-743d-479c-807b-ee99d9df6fa0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:03:55.856152 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.855977 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb668fe3-743d-479c-807b-ee99d9df6fa0-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br\" (UID: \"fb668fe3-743d-479c-807b-ee99d9df6fa0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:03:55.856152 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.856002 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb668fe3-743d-479c-807b-ee99d9df6fa0-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br\" (UID: \"fb668fe3-743d-479c-807b-ee99d9df6fa0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:03:55.856152 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.856093 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fb668fe3-743d-479c-807b-ee99d9df6fa0-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br\" (UID: \"fb668fe3-743d-479c-807b-ee99d9df6fa0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:03:55.957165 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.957119 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb668fe3-743d-479c-807b-ee99d9df6fa0-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br\" (UID: \"fb668fe3-743d-479c-807b-ee99d9df6fa0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:03:55.957347 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.957172 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fb668fe3-743d-479c-807b-ee99d9df6fa0-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br\" (UID: \"fb668fe3-743d-479c-807b-ee99d9df6fa0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:03:55.957347 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.957307 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jw5vr\" (UniqueName: \"kubernetes.io/projected/fb668fe3-743d-479c-807b-ee99d9df6fa0-kube-api-access-jw5vr\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br\" (UID: \"fb668fe3-743d-479c-807b-ee99d9df6fa0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:03:55.957461 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.957445 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb668fe3-743d-479c-807b-ee99d9df6fa0-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br\" (UID: \"fb668fe3-743d-479c-807b-ee99d9df6fa0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:03:55.957552 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.957482 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb668fe3-743d-479c-807b-ee99d9df6fa0-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br\" (UID: \"fb668fe3-743d-479c-807b-ee99d9df6fa0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:03:55.957611 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.957581 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fb668fe3-743d-479c-807b-ee99d9df6fa0-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br\" (UID: \"fb668fe3-743d-479c-807b-ee99d9df6fa0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:03:55.957611 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.957595 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fb668fe3-743d-479c-807b-ee99d9df6fa0-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br\" (UID: \"fb668fe3-743d-479c-807b-ee99d9df6fa0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:03:55.957693 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.957662 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb668fe3-743d-479c-807b-ee99d9df6fa0-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br\" (UID: \"fb668fe3-743d-479c-807b-ee99d9df6fa0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:03:55.957822 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.957805 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb668fe3-743d-479c-807b-ee99d9df6fa0-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br\" (UID: \"fb668fe3-743d-479c-807b-ee99d9df6fa0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:03:55.959808 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.959782 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fb668fe3-743d-479c-807b-ee99d9df6fa0-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br\" (UID: \"fb668fe3-743d-479c-807b-ee99d9df6fa0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:03:55.960106 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.960086 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb668fe3-743d-479c-807b-ee99d9df6fa0-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br\" (UID: \"fb668fe3-743d-479c-807b-ee99d9df6fa0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:03:55.964751 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:55.964729 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw5vr\" (UniqueName: \"kubernetes.io/projected/fb668fe3-743d-479c-807b-ee99d9df6fa0-kube-api-access-jw5vr\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br\" (UID: \"fb668fe3-743d-479c-807b-ee99d9df6fa0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:03:56.054955 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:56.054861 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:03:56.188969 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:56.188937 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"]
Apr 20 15:03:56.191569 ip-10-0-129-82 kubenswrapper[2575]: W0420 15:03:56.191540 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb668fe3_743d_479c_807b_ee99d9df6fa0.slice/crio-4aa9bc20f2b4c3df4345eed6d33dcfe053efcbdb97807ed429206306dd02113d WatchSource:0}: Error finding container 4aa9bc20f2b4c3df4345eed6d33dcfe053efcbdb97807ed429206306dd02113d: Status 404 returned error can't find the container with id 4aa9bc20f2b4c3df4345eed6d33dcfe053efcbdb97807ed429206306dd02113d
Apr 20 15:03:56.868014 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:03:56.867980 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br" event={"ID":"fb668fe3-743d-479c-807b-ee99d9df6fa0","Type":"ContainerStarted","Data":"4aa9bc20f2b4c3df4345eed6d33dcfe053efcbdb97807ed429206306dd02113d"}
Apr 20 15:04:03.649063 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:03.649023 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-66744456cc-w6qjh"]
Apr 20 15:04:03.675807 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:03.649279 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-66744456cc-w6qjh" podUID="720b8ce8-54f9-459d-b0bf-e89728715adc" containerName="maas-api" containerID="cri-o://330a23edccaefb8690d61c45c320db411bd9d4c06f82dbc5b061e069ba3cf05e" gracePeriod=30
Apr 20 15:04:03.782321 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:03.782252 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="opendatahub/maas-api-66744456cc-w6qjh" podUID="720b8ce8-54f9-459d-b0bf-e89728715adc" containerName="maas-api" probeResult="failure" output="Get \"https://10.132.0.34:8443/health\": dial tcp 10.132.0.34:8443: connect: connection refused"
Apr 20 15:04:04.899796 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:04.899757 2575 generic.go:358] "Generic (PLEG): container finished" podID="720b8ce8-54f9-459d-b0bf-e89728715adc" containerID="330a23edccaefb8690d61c45c320db411bd9d4c06f82dbc5b061e069ba3cf05e" exitCode=0
Apr 20 15:04:04.900129 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:04.899805 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-66744456cc-w6qjh" event={"ID":"720b8ce8-54f9-459d-b0bf-e89728715adc","Type":"ContainerDied","Data":"330a23edccaefb8690d61c45c320db411bd9d4c06f82dbc5b061e069ba3cf05e"}
Apr 20 15:04:04.985282 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:04.985258 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-66744456cc-w6qjh"
Apr 20 15:04:05.142619 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:05.142501 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvlvf\" (UniqueName: \"kubernetes.io/projected/720b8ce8-54f9-459d-b0bf-e89728715adc-kube-api-access-lvlvf\") pod \"720b8ce8-54f9-459d-b0bf-e89728715adc\" (UID: \"720b8ce8-54f9-459d-b0bf-e89728715adc\") "
Apr 20 15:04:05.142619 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:05.142585 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/720b8ce8-54f9-459d-b0bf-e89728715adc-maas-api-tls\") pod \"720b8ce8-54f9-459d-b0bf-e89728715adc\" (UID: \"720b8ce8-54f9-459d-b0bf-e89728715adc\") "
Apr 20 15:04:05.144740 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:05.144708 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/720b8ce8-54f9-459d-b0bf-e89728715adc-kube-api-access-lvlvf" (OuterVolumeSpecName: "kube-api-access-lvlvf") pod "720b8ce8-54f9-459d-b0bf-e89728715adc" (UID: "720b8ce8-54f9-459d-b0bf-e89728715adc"). InnerVolumeSpecName "kube-api-access-lvlvf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 15:04:05.144851 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:05.144744 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/720b8ce8-54f9-459d-b0bf-e89728715adc-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "720b8ce8-54f9-459d-b0bf-e89728715adc" (UID: "720b8ce8-54f9-459d-b0bf-e89728715adc"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 15:04:05.244340 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:05.244291 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lvlvf\" (UniqueName: \"kubernetes.io/projected/720b8ce8-54f9-459d-b0bf-e89728715adc-kube-api-access-lvlvf\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\""
Apr 20 15:04:05.244340 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:05.244335 2575 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/720b8ce8-54f9-459d-b0bf-e89728715adc-maas-api-tls\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\""
Apr 20 15:04:05.905091 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:05.905060 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-66744456cc-w6qjh"
Apr 20 15:04:05.905547 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:05.905084 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-66744456cc-w6qjh" event={"ID":"720b8ce8-54f9-459d-b0bf-e89728715adc","Type":"ContainerDied","Data":"1cb3fc4a3bbf98341998796fdca384720af397b7f29b65294e4deebea669da93"}
Apr 20 15:04:05.905547 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:05.905144 2575 scope.go:117] "RemoveContainer" containerID="330a23edccaefb8690d61c45c320db411bd9d4c06f82dbc5b061e069ba3cf05e"
Apr 20 15:04:05.906806 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:05.906769 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br" event={"ID":"fb668fe3-743d-479c-807b-ee99d9df6fa0","Type":"ContainerStarted","Data":"527b4a263b0b75e24abc4851644ccac4c9f4ecaf6418a06d9e3de09355797057"}
Apr 20 15:04:05.933957 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:05.933918 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-66744456cc-w6qjh"]
Apr 20 15:04:05.935996 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:05.935966 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-66744456cc-w6qjh"]
Apr 20 15:04:06.865988 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:06.865946 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="720b8ce8-54f9-459d-b0bf-e89728715adc" path="/var/lib/kubelet/pods/720b8ce8-54f9-459d-b0bf-e89728715adc/volumes"
Apr 20 15:04:10.927320 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:10.927289 2575 generic.go:358] "Generic (PLEG): container finished" podID="fb668fe3-743d-479c-807b-ee99d9df6fa0" containerID="527b4a263b0b75e24abc4851644ccac4c9f4ecaf6418a06d9e3de09355797057" exitCode=0
Apr 20 15:04:10.927729 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:10.927372 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br" event={"ID":"fb668fe3-743d-479c-807b-ee99d9df6fa0","Type":"ContainerDied","Data":"527b4a263b0b75e24abc4851644ccac4c9f4ecaf6418a06d9e3de09355797057"}
Apr 20 15:04:12.127705 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.127670 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"]
Apr 20 15:04:12.128231 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.128143 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="720b8ce8-54f9-459d-b0bf-e89728715adc" containerName="maas-api"
Apr 20 15:04:12.128231 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.128159 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="720b8ce8-54f9-459d-b0bf-e89728715adc" containerName="maas-api"
Apr 20 15:04:12.128231 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.128211 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="720b8ce8-54f9-459d-b0bf-e89728715adc" containerName="maas-api"
Apr 20 15:04:12.131347 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.131323 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:12.133765 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.133728 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\""
Apr 20 15:04:12.140914 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.140881 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"]
Apr 20 15:04:12.210454 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.210410 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twll8\" (UniqueName: \"kubernetes.io/projected/8452376a-6fe6-4616-b5f0-3702f95b1d4b-kube-api-access-twll8\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-nvdqw\" (UID: \"8452376a-6fe6-4616-b5f0-3702f95b1d4b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:12.210663 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.210468 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8452376a-6fe6-4616-b5f0-3702f95b1d4b-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-nvdqw\" (UID: \"8452376a-6fe6-4616-b5f0-3702f95b1d4b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:12.210663 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.210573 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8452376a-6fe6-4616-b5f0-3702f95b1d4b-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-nvdqw\" (UID: \"8452376a-6fe6-4616-b5f0-3702f95b1d4b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:12.210663 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.210609 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8452376a-6fe6-4616-b5f0-3702f95b1d4b-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-nvdqw\" (UID: \"8452376a-6fe6-4616-b5f0-3702f95b1d4b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:12.210663 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.210654 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8452376a-6fe6-4616-b5f0-3702f95b1d4b-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-nvdqw\" (UID: \"8452376a-6fe6-4616-b5f0-3702f95b1d4b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:12.210796 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.210683 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8452376a-6fe6-4616-b5f0-3702f95b1d4b-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-nvdqw\" (UID: \"8452376a-6fe6-4616-b5f0-3702f95b1d4b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:12.312035 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.311931 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8452376a-6fe6-4616-b5f0-3702f95b1d4b-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-nvdqw\" (UID: \"8452376a-6fe6-4616-b5f0-3702f95b1d4b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:12.312035 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.312000 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8452376a-6fe6-4616-b5f0-3702f95b1d4b-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-nvdqw\" (UID: \"8452376a-6fe6-4616-b5f0-3702f95b1d4b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:12.312240 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.312059 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twll8\" (UniqueName: \"kubernetes.io/projected/8452376a-6fe6-4616-b5f0-3702f95b1d4b-kube-api-access-twll8\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-nvdqw\" (UID: \"8452376a-6fe6-4616-b5f0-3702f95b1d4b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:12.312240 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.312094 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8452376a-6fe6-4616-b5f0-3702f95b1d4b-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-nvdqw\" (UID: \"8452376a-6fe6-4616-b5f0-3702f95b1d4b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:12.312240 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.312141 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8452376a-6fe6-4616-b5f0-3702f95b1d4b-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-nvdqw\" (UID: \"8452376a-6fe6-4616-b5f0-3702f95b1d4b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:12.312240 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.312189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8452376a-6fe6-4616-b5f0-3702f95b1d4b-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-nvdqw\" (UID: \"8452376a-6fe6-4616-b5f0-3702f95b1d4b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:12.312858 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.312368 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8452376a-6fe6-4616-b5f0-3702f95b1d4b-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-nvdqw\" (UID: \"8452376a-6fe6-4616-b5f0-3702f95b1d4b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:12.312994 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.312970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8452376a-6fe6-4616-b5f0-3702f95b1d4b-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-nvdqw\" (UID: \"8452376a-6fe6-4616-b5f0-3702f95b1d4b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:12.313052 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.313037 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8452376a-6fe6-4616-b5f0-3702f95b1d4b-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-nvdqw\" (UID: \"8452376a-6fe6-4616-b5f0-3702f95b1d4b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:12.314950 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.314926 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8452376a-6fe6-4616-b5f0-3702f95b1d4b-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-nvdqw\" (UID: \"8452376a-6fe6-4616-b5f0-3702f95b1d4b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:12.315306 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.315286 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8452376a-6fe6-4616-b5f0-3702f95b1d4b-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-nvdqw\" (UID: \"8452376a-6fe6-4616-b5f0-3702f95b1d4b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:12.320190 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.320163 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twll8\" (UniqueName: \"kubernetes.io/projected/8452376a-6fe6-4616-b5f0-3702f95b1d4b-kube-api-access-twll8\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-nvdqw\" (UID: \"8452376a-6fe6-4616-b5f0-3702f95b1d4b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:12.443905 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.443855 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:12.597965 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.597941 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"]
Apr 20 15:04:12.599864 ip-10-0-129-82 kubenswrapper[2575]: W0420 15:04:12.599835 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8452376a_6fe6_4616_b5f0_3702f95b1d4b.slice/crio-e2b2de9d05ea8922d3b643cf274eaf04c1edf237491bb02fc2612534f586d3af WatchSource:0}: Error finding container e2b2de9d05ea8922d3b643cf274eaf04c1edf237491bb02fc2612534f586d3af: Status 404 returned error can't find the container with id e2b2de9d05ea8922d3b643cf274eaf04c1edf237491bb02fc2612534f586d3af
Apr 20 15:04:12.935534 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.935415 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw" event={"ID":"8452376a-6fe6-4616-b5f0-3702f95b1d4b","Type":"ContainerStarted","Data":"f9e5ca08121e7d61ab343a142c6b4e29aebd577ca11ed588189ecedd6d08ca56"}
Apr 20 15:04:12.935534 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.935462 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw" event={"ID":"8452376a-6fe6-4616-b5f0-3702f95b1d4b","Type":"ContainerStarted","Data":"e2b2de9d05ea8922d3b643cf274eaf04c1edf237491bb02fc2612534f586d3af"}
Apr 20 15:04:12.937129 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.937103 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br" event={"ID":"fb668fe3-743d-479c-807b-ee99d9df6fa0","Type":"ContainerStarted","Data":"860ad7a9a6f2047a2fa7946b05681e90c51cc6fe0f0ea3be746f2af6faa60aa0"}
Apr 20 15:04:12.937372 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.937352 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:04:12.967488 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:12.967422 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br" podStartSLOduration=2.116806547 podStartE2EDuration="17.96740205s" podCreationTimestamp="2026-04-20 15:03:55 +0000 UTC" firstStartedPulling="2026-04-20 15:03:56.193432467 +0000 UTC m=+627.924750733" lastFinishedPulling="2026-04-20 15:04:12.044027967 +0000 UTC m=+643.775346236" observedRunningTime="2026-04-20 15:04:12.965587128 +0000 UTC m=+644.696905418" watchObservedRunningTime="2026-04-20 15:04:12.96740205 +0000 UTC m=+644.698720338"
Apr 20 15:04:18.959438 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:18.959400 2575 generic.go:358] "Generic (PLEG): container finished" podID="8452376a-6fe6-4616-b5f0-3702f95b1d4b" containerID="f9e5ca08121e7d61ab343a142c6b4e29aebd577ca11ed588189ecedd6d08ca56" exitCode=0
Apr 20 15:04:18.959868 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:18.959475 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw" event={"ID":"8452376a-6fe6-4616-b5f0-3702f95b1d4b","Type":"ContainerDied","Data":"f9e5ca08121e7d61ab343a142c6b4e29aebd577ca11ed588189ecedd6d08ca56"}
Apr 20 15:04:19.964008 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:19.963971 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw" event={"ID":"8452376a-6fe6-4616-b5f0-3702f95b1d4b","Type":"ContainerStarted","Data":"1f26ed6fee720447668864d6d8a2d658dbc74dbf6fab9dccc1345f1befaa4707"}
Apr 20 15:04:19.964398 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:19.964189 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw"
Apr 20 15:04:19.988211 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:19.988137 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw" podStartSLOduration=7.686676293 podStartE2EDuration="7.988122742s" podCreationTimestamp="2026-04-20 15:04:12 +0000 UTC" firstStartedPulling="2026-04-20 15:04:18.960075877 +0000 UTC m=+650.691394142" lastFinishedPulling="2026-04-20 15:04:19.261522311 +0000 UTC m=+650.992840591" observedRunningTime="2026-04-20 15:04:19.987496221 +0000 UTC m=+651.718814517" watchObservedRunningTime="2026-04-20 15:04:19.988122742 +0000 UTC m=+651.719441030"
Apr 20 15:04:23.955305 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:23.955275 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br"
Apr 20 15:04:25.032422 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.032388 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq"]
Apr 20 15:04:25.039270 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.039246 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq"
Apr 20 15:04:25.041649 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.041625 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\""
Apr 20 15:04:25.045194 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.045169 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq"]
Apr 20 15:04:25.140871 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.140828 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/57b489ab-a8d9-4a73-b1cb-f77423937f72-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq\" (UID: \"57b489ab-a8d9-4a73-b1cb-f77423937f72\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq"
Apr 20 15:04:25.141056 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.140887 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/57b489ab-a8d9-4a73-b1cb-f77423937f72-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq\" (UID: \"57b489ab-a8d9-4a73-b1cb-f77423937f72\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq"
Apr 20 15:04:25.141056 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.140912 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/57b489ab-a8d9-4a73-b1cb-f77423937f72-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq\" (UID: \"57b489ab-a8d9-4a73-b1cb-f77423937f72\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq"
Apr 20 15:04:25.141056 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.140932 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9rvw\" (UniqueName: \"kubernetes.io/projected/57b489ab-a8d9-4a73-b1cb-f77423937f72-kube-api-access-h9rvw\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq\" (UID: \"57b489ab-a8d9-4a73-b1cb-f77423937f72\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq"
Apr 20 15:04:25.141056 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.141034 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/57b489ab-a8d9-4a73-b1cb-f77423937f72-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq\" (UID: \"57b489ab-a8d9-4a73-b1cb-f77423937f72\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq"
Apr 20 15:04:25.141201 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.141067 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57b489ab-a8d9-4a73-b1cb-f77423937f72-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq\" (UID: \"57b489ab-a8d9-4a73-b1cb-f77423937f72\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq"
Apr 20 15:04:25.242248 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.242209 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/57b489ab-a8d9-4a73-b1cb-f77423937f72-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq\" (UID: \"57b489ab-a8d9-4a73-b1cb-f77423937f72\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq"
Apr 20 15:04:25.242248 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.242252 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/57b489ab-a8d9-4a73-b1cb-f77423937f72-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq\" (UID: \"57b489ab-a8d9-4a73-b1cb-f77423937f72\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq"
Apr 20 15:04:25.242498 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.242274 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9rvw\" (UniqueName: \"kubernetes.io/projected/57b489ab-a8d9-4a73-b1cb-f77423937f72-kube-api-access-h9rvw\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq\" (UID: \"57b489ab-a8d9-4a73-b1cb-f77423937f72\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq"
Apr 20 15:04:25.242498 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.242353 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/57b489ab-a8d9-4a73-b1cb-f77423937f72-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq\" (UID: \"57b489ab-a8d9-4a73-b1cb-f77423937f72\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq"
Apr 20 15:04:25.242498 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.242377 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57b489ab-a8d9-4a73-b1cb-f77423937f72-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq\" (UID: \"57b489ab-a8d9-4a73-b1cb-f77423937f72\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq"
Apr 20 15:04:25.242498 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.242423 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/57b489ab-a8d9-4a73-b1cb-f77423937f72-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq\" (UID: \"57b489ab-a8d9-4a73-b1cb-f77423937f72\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq"
Apr 20 15:04:25.242732 ip-10-0-129-82
kubenswrapper[2575]: I0420 15:04:25.242714 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/57b489ab-a8d9-4a73-b1cb-f77423937f72-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq\" (UID: \"57b489ab-a8d9-4a73-b1cb-f77423937f72\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq" Apr 20 15:04:25.242797 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.242774 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/57b489ab-a8d9-4a73-b1cb-f77423937f72-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq\" (UID: \"57b489ab-a8d9-4a73-b1cb-f77423937f72\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq" Apr 20 15:04:25.242841 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.242816 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57b489ab-a8d9-4a73-b1cb-f77423937f72-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq\" (UID: \"57b489ab-a8d9-4a73-b1cb-f77423937f72\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq" Apr 20 15:04:25.244652 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.244630 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/57b489ab-a8d9-4a73-b1cb-f77423937f72-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq\" (UID: \"57b489ab-a8d9-4a73-b1cb-f77423937f72\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq" Apr 20 15:04:25.244945 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.244929 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/57b489ab-a8d9-4a73-b1cb-f77423937f72-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq\" 
(UID: \"57b489ab-a8d9-4a73-b1cb-f77423937f72\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq" Apr 20 15:04:25.250442 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.250416 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9rvw\" (UniqueName: \"kubernetes.io/projected/57b489ab-a8d9-4a73-b1cb-f77423937f72-kube-api-access-h9rvw\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq\" (UID: \"57b489ab-a8d9-4a73-b1cb-f77423937f72\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq" Apr 20 15:04:25.351208 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.351114 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq" Apr 20 15:04:25.476976 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.476947 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq"] Apr 20 15:04:25.479005 ip-10-0-129-82 kubenswrapper[2575]: W0420 15:04:25.478969 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57b489ab_a8d9_4a73_b1cb_f77423937f72.slice/crio-bc847d2189ebd5f63cee2143d79c49518fd8089da4a46c5af4dfd6346570ffbe WatchSource:0}: Error finding container bc847d2189ebd5f63cee2143d79c49518fd8089da4a46c5af4dfd6346570ffbe: Status 404 returned error can't find the container with id bc847d2189ebd5f63cee2143d79c49518fd8089da4a46c5af4dfd6346570ffbe Apr 20 15:04:25.481526 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.481494 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 15:04:25.988275 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.988235 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq" 
event={"ID":"57b489ab-a8d9-4a73-b1cb-f77423937f72","Type":"ContainerStarted","Data":"1ee5b5eefe4a806ebf3191e2702bcadf8255ac9b6e1167d69f5a2bfb037397c8"} Apr 20 15:04:25.988275 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:25.988281 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq" event={"ID":"57b489ab-a8d9-4a73-b1cb-f77423937f72","Type":"ContainerStarted","Data":"bc847d2189ebd5f63cee2143d79c49518fd8089da4a46c5af4dfd6346570ffbe"} Apr 20 15:04:30.980612 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:30.980580 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-nvdqw" Apr 20 15:04:32.011588 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:32.011554 2575 generic.go:358] "Generic (PLEG): container finished" podID="57b489ab-a8d9-4a73-b1cb-f77423937f72" containerID="1ee5b5eefe4a806ebf3191e2702bcadf8255ac9b6e1167d69f5a2bfb037397c8" exitCode=0 Apr 20 15:04:32.012060 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:32.011629 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq" event={"ID":"57b489ab-a8d9-4a73-b1cb-f77423937f72","Type":"ContainerDied","Data":"1ee5b5eefe4a806ebf3191e2702bcadf8255ac9b6e1167d69f5a2bfb037397c8"} Apr 20 15:04:33.016633 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.016597 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq" event={"ID":"57b489ab-a8d9-4a73-b1cb-f77423937f72","Type":"ContainerStarted","Data":"7ce6f7462235a65a4b91a6a91de9e334282cea8783a50ed23fa14217b000198f"} Apr 20 15:04:33.017019 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.016822 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq" Apr 20 15:04:33.041731 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.041674 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq" podStartSLOduration=7.756615227 podStartE2EDuration="8.04165866s" podCreationTimestamp="2026-04-20 15:04:25 +0000 UTC" firstStartedPulling="2026-04-20 15:04:32.01235783 +0000 UTC m=+663.743676096" lastFinishedPulling="2026-04-20 15:04:32.297401246 +0000 UTC m=+664.028719529" observedRunningTime="2026-04-20 15:04:33.039107925 +0000 UTC m=+664.770426210" watchObservedRunningTime="2026-04-20 15:04:33.04165866 +0000 UTC m=+664.772976948" Apr 20 15:04:33.250812 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.250777 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj"] Apr 20 15:04:33.254317 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.254300 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:33.257222 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.257198 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 20 15:04:33.265430 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.265405 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj"] Apr 20 15:04:33.314912 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.314807 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc00640a-59de-471b-b4f9-bc8725ae64b2-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj\" (UID: \"dc00640a-59de-471b-b4f9-bc8725ae64b2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:33.314912 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.314859 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dc00640a-59de-471b-b4f9-bc8725ae64b2-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj\" (UID: \"dc00640a-59de-471b-b4f9-bc8725ae64b2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:33.315130 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.314915 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dc00640a-59de-471b-b4f9-bc8725ae64b2-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj\" (UID: \"dc00640a-59de-471b-b4f9-bc8725ae64b2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:33.315130 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.315001 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc00640a-59de-471b-b4f9-bc8725ae64b2-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj\" (UID: \"dc00640a-59de-471b-b4f9-bc8725ae64b2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:33.315130 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.315102 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc00640a-59de-471b-b4f9-bc8725ae64b2-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj\" (UID: \"dc00640a-59de-471b-b4f9-bc8725ae64b2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:33.315268 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.315156 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6g9w\" (UniqueName: \"kubernetes.io/projected/dc00640a-59de-471b-b4f9-bc8725ae64b2-kube-api-access-s6g9w\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj\" (UID: 
\"dc00640a-59de-471b-b4f9-bc8725ae64b2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:33.416362 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.416323 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc00640a-59de-471b-b4f9-bc8725ae64b2-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj\" (UID: \"dc00640a-59de-471b-b4f9-bc8725ae64b2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:33.416582 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.416413 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc00640a-59de-471b-b4f9-bc8725ae64b2-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj\" (UID: \"dc00640a-59de-471b-b4f9-bc8725ae64b2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:33.416582 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.416453 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6g9w\" (UniqueName: \"kubernetes.io/projected/dc00640a-59de-471b-b4f9-bc8725ae64b2-kube-api-access-s6g9w\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj\" (UID: \"dc00640a-59de-471b-b4f9-bc8725ae64b2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:33.416582 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.416478 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc00640a-59de-471b-b4f9-bc8725ae64b2-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj\" (UID: \"dc00640a-59de-471b-b4f9-bc8725ae64b2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:33.416582 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.416537 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dc00640a-59de-471b-b4f9-bc8725ae64b2-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj\" (UID: \"dc00640a-59de-471b-b4f9-bc8725ae64b2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:33.416582 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.416568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dc00640a-59de-471b-b4f9-bc8725ae64b2-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj\" (UID: \"dc00640a-59de-471b-b4f9-bc8725ae64b2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:33.416958 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.416938 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dc00640a-59de-471b-b4f9-bc8725ae64b2-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj\" (UID: \"dc00640a-59de-471b-b4f9-bc8725ae64b2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:33.417025 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.416937 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc00640a-59de-471b-b4f9-bc8725ae64b2-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj\" (UID: \"dc00640a-59de-471b-b4f9-bc8725ae64b2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:33.417076 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.417051 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc00640a-59de-471b-b4f9-bc8725ae64b2-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj\" (UID: \"dc00640a-59de-471b-b4f9-bc8725ae64b2\") " 
pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:33.418736 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.418712 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dc00640a-59de-471b-b4f9-bc8725ae64b2-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj\" (UID: \"dc00640a-59de-471b-b4f9-bc8725ae64b2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:33.419071 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.419050 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc00640a-59de-471b-b4f9-bc8725ae64b2-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj\" (UID: \"dc00640a-59de-471b-b4f9-bc8725ae64b2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:33.426911 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.426886 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6g9w\" (UniqueName: \"kubernetes.io/projected/dc00640a-59de-471b-b4f9-bc8725ae64b2-kube-api-access-s6g9w\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj\" (UID: \"dc00640a-59de-471b-b4f9-bc8725ae64b2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:33.565262 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.565170 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:33.691551 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:33.691495 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj"] Apr 20 15:04:33.694642 ip-10-0-129-82 kubenswrapper[2575]: W0420 15:04:33.694596 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc00640a_59de_471b_b4f9_bc8725ae64b2.slice/crio-a92ea82e65bb970551ae4ff60dd938d3469312cfef7aab360ef043e1ae8a6404 WatchSource:0}: Error finding container a92ea82e65bb970551ae4ff60dd938d3469312cfef7aab360ef043e1ae8a6404: Status 404 returned error can't find the container with id a92ea82e65bb970551ae4ff60dd938d3469312cfef7aab360ef043e1ae8a6404 Apr 20 15:04:34.022125 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:34.022091 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" event={"ID":"dc00640a-59de-471b-b4f9-bc8725ae64b2","Type":"ContainerStarted","Data":"caf25a7ded0acdab747046d0d99aa4ac0efccc6187accee777dc7e236e0195a7"} Apr 20 15:04:34.022125 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:34.022128 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" event={"ID":"dc00640a-59de-471b-b4f9-bc8725ae64b2","Type":"ContainerStarted","Data":"a92ea82e65bb970551ae4ff60dd938d3469312cfef7aab360ef043e1ae8a6404"} Apr 20 15:04:40.046558 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:40.046506 2575 generic.go:358] "Generic (PLEG): container finished" podID="dc00640a-59de-471b-b4f9-bc8725ae64b2" containerID="caf25a7ded0acdab747046d0d99aa4ac0efccc6187accee777dc7e236e0195a7" exitCode=0 Apr 20 15:04:40.046938 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:40.046579 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" 
event={"ID":"dc00640a-59de-471b-b4f9-bc8725ae64b2","Type":"ContainerDied","Data":"caf25a7ded0acdab747046d0d99aa4ac0efccc6187accee777dc7e236e0195a7"} Apr 20 15:04:41.052119 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:41.052078 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" event={"ID":"dc00640a-59de-471b-b4f9-bc8725ae64b2","Type":"ContainerStarted","Data":"b7180df792c3eaabdba619f2365c88aa02a6ede081a57e3e3022738d23dfb651"} Apr 20 15:04:41.052575 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:41.052296 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:04:41.069555 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:41.069485 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" podStartSLOduration=7.509456297 podStartE2EDuration="8.06946824s" podCreationTimestamp="2026-04-20 15:04:33 +0000 UTC" firstStartedPulling="2026-04-20 15:04:40.047196309 +0000 UTC m=+671.778514574" lastFinishedPulling="2026-04-20 15:04:40.607208251 +0000 UTC m=+672.338526517" observedRunningTime="2026-04-20 15:04:41.067993749 +0000 UTC m=+672.799312038" watchObservedRunningTime="2026-04-20 15:04:41.06946824 +0000 UTC m=+672.800786555" Apr 20 15:04:44.035292 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:44.035260 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq" Apr 20 15:04:52.069558 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:04:52.069500 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj" Apr 20 15:06:46.039593 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:46.039501 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-f96598d7b-zfzzp"] Apr 20 15:06:46.040097 
ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:46.039787 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-f96598d7b-zfzzp" podUID="31e1b91d-8e9a-44de-b47e-bf47227c7b1f" containerName="manager" containerID="cri-o://fae15cf0fe47c573f0cb89f49abb9a8dbae24513f1087038ddd1368d7b77b18f" gracePeriod=10 Apr 20 15:06:46.274031 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:46.274004 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f96598d7b-zfzzp" Apr 20 15:06:46.417209 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:46.417121 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp7w9\" (UniqueName: \"kubernetes.io/projected/31e1b91d-8e9a-44de-b47e-bf47227c7b1f-kube-api-access-jp7w9\") pod \"31e1b91d-8e9a-44de-b47e-bf47227c7b1f\" (UID: \"31e1b91d-8e9a-44de-b47e-bf47227c7b1f\") " Apr 20 15:06:46.419199 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:46.419167 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e1b91d-8e9a-44de-b47e-bf47227c7b1f-kube-api-access-jp7w9" (OuterVolumeSpecName: "kube-api-access-jp7w9") pod "31e1b91d-8e9a-44de-b47e-bf47227c7b1f" (UID: "31e1b91d-8e9a-44de-b47e-bf47227c7b1f"). InnerVolumeSpecName "kube-api-access-jp7w9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:06:46.486960 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:46.486924 2575 generic.go:358] "Generic (PLEG): container finished" podID="31e1b91d-8e9a-44de-b47e-bf47227c7b1f" containerID="fae15cf0fe47c573f0cb89f49abb9a8dbae24513f1087038ddd1368d7b77b18f" exitCode=0 Apr 20 15:06:46.486960 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:46.486965 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f96598d7b-zfzzp" event={"ID":"31e1b91d-8e9a-44de-b47e-bf47227c7b1f","Type":"ContainerDied","Data":"fae15cf0fe47c573f0cb89f49abb9a8dbae24513f1087038ddd1368d7b77b18f"} Apr 20 15:06:46.487198 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:46.486984 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f96598d7b-zfzzp" Apr 20 15:06:46.487198 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:46.487003 2575 scope.go:117] "RemoveContainer" containerID="fae15cf0fe47c573f0cb89f49abb9a8dbae24513f1087038ddd1368d7b77b18f" Apr 20 15:06:46.487198 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:46.486991 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f96598d7b-zfzzp" event={"ID":"31e1b91d-8e9a-44de-b47e-bf47227c7b1f","Type":"ContainerDied","Data":"357e082e521938e7ffb01c0ae60e6547e363f3774041b79b6a89270e4edf3517"} Apr 20 15:06:46.495114 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:46.495097 2575 scope.go:117] "RemoveContainer" containerID="fae15cf0fe47c573f0cb89f49abb9a8dbae24513f1087038ddd1368d7b77b18f" Apr 20 15:06:46.495386 ip-10-0-129-82 kubenswrapper[2575]: E0420 15:06:46.495367 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fae15cf0fe47c573f0cb89f49abb9a8dbae24513f1087038ddd1368d7b77b18f\": container with ID starting with fae15cf0fe47c573f0cb89f49abb9a8dbae24513f1087038ddd1368d7b77b18f not found: ID does 
not exist" containerID="fae15cf0fe47c573f0cb89f49abb9a8dbae24513f1087038ddd1368d7b77b18f" Apr 20 15:06:46.495431 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:46.495394 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fae15cf0fe47c573f0cb89f49abb9a8dbae24513f1087038ddd1368d7b77b18f"} err="failed to get container status \"fae15cf0fe47c573f0cb89f49abb9a8dbae24513f1087038ddd1368d7b77b18f\": rpc error: code = NotFound desc = could not find container \"fae15cf0fe47c573f0cb89f49abb9a8dbae24513f1087038ddd1368d7b77b18f\": container with ID starting with fae15cf0fe47c573f0cb89f49abb9a8dbae24513f1087038ddd1368d7b77b18f not found: ID does not exist" Apr 20 15:06:46.507423 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:46.507395 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-f96598d7b-zfzzp"] Apr 20 15:06:46.510501 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:46.510477 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-f96598d7b-zfzzp"] Apr 20 15:06:46.518016 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:46.517995 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jp7w9\" (UniqueName: \"kubernetes.io/projected/31e1b91d-8e9a-44de-b47e-bf47227c7b1f-kube-api-access-jp7w9\") on node \"ip-10-0-129-82.ec2.internal\" DevicePath \"\"" Apr 20 15:06:46.864848 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:46.864814 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31e1b91d-8e9a-44de-b47e-bf47227c7b1f" path="/var/lib/kubelet/pods/31e1b91d-8e9a-44de-b47e-bf47227c7b1f/volumes" Apr 20 15:06:47.237793 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:47.237759 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-f96598d7b-t5qwc"] Apr 20 15:06:47.238161 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:47.238128 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="31e1b91d-8e9a-44de-b47e-bf47227c7b1f" containerName="manager" Apr 20 15:06:47.238161 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:47.238139 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e1b91d-8e9a-44de-b47e-bf47227c7b1f" containerName="manager" Apr 20 15:06:47.238231 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:47.238215 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="31e1b91d-8e9a-44de-b47e-bf47227c7b1f" containerName="manager" Apr 20 15:06:47.242731 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:47.242709 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f96598d7b-t5qwc" Apr 20 15:06:47.244863 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:47.244843 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-nb4wt\"" Apr 20 15:06:47.248934 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:47.248904 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f96598d7b-t5qwc"] Apr 20 15:06:47.325818 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:47.325789 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnddp\" (UniqueName: \"kubernetes.io/projected/bab56dd7-c872-4589-9032-143fb2f51f53-kube-api-access-mnddp\") pod \"maas-controller-f96598d7b-t5qwc\" (UID: \"bab56dd7-c872-4589-9032-143fb2f51f53\") " pod="opendatahub/maas-controller-f96598d7b-t5qwc" Apr 20 15:06:47.427041 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:47.427005 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnddp\" (UniqueName: \"kubernetes.io/projected/bab56dd7-c872-4589-9032-143fb2f51f53-kube-api-access-mnddp\") pod \"maas-controller-f96598d7b-t5qwc\" (UID: \"bab56dd7-c872-4589-9032-143fb2f51f53\") " pod="opendatahub/maas-controller-f96598d7b-t5qwc" Apr 20 
15:06:47.435135 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:47.435098 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnddp\" (UniqueName: \"kubernetes.io/projected/bab56dd7-c872-4589-9032-143fb2f51f53-kube-api-access-mnddp\") pod \"maas-controller-f96598d7b-t5qwc\" (UID: \"bab56dd7-c872-4589-9032-143fb2f51f53\") " pod="opendatahub/maas-controller-f96598d7b-t5qwc" Apr 20 15:06:47.554432 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:47.554349 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f96598d7b-t5qwc" Apr 20 15:06:47.676831 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:47.676754 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f96598d7b-t5qwc"] Apr 20 15:06:47.679240 ip-10-0-129-82 kubenswrapper[2575]: W0420 15:06:47.679214 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbab56dd7_c872_4589_9032_143fb2f51f53.slice/crio-e351f5b0f9e0c34ff12284042ef98d5ca67995463dee57db930d451789db43a0 WatchSource:0}: Error finding container e351f5b0f9e0c34ff12284042ef98d5ca67995463dee57db930d451789db43a0: Status 404 returned error can't find the container with id e351f5b0f9e0c34ff12284042ef98d5ca67995463dee57db930d451789db43a0 Apr 20 15:06:48.496528 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:48.496482 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f96598d7b-t5qwc" event={"ID":"bab56dd7-c872-4589-9032-143fb2f51f53","Type":"ContainerStarted","Data":"4dbbc214299ccef364d147852a371c34db83a75167acac09b4b6f5649ee7b0f5"} Apr 20 15:06:48.496977 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:48.496536 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f96598d7b-t5qwc" 
event={"ID":"bab56dd7-c872-4589-9032-143fb2f51f53","Type":"ContainerStarted","Data":"e351f5b0f9e0c34ff12284042ef98d5ca67995463dee57db930d451789db43a0"} Apr 20 15:06:48.496977 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:48.496670 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-f96598d7b-t5qwc" Apr 20 15:06:48.512265 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:48.512212 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-f96598d7b-t5qwc" podStartSLOduration=1.071314399 podStartE2EDuration="1.512194435s" podCreationTimestamp="2026-04-20 15:06:47 +0000 UTC" firstStartedPulling="2026-04-20 15:06:47.680915751 +0000 UTC m=+799.412234018" lastFinishedPulling="2026-04-20 15:06:48.121795779 +0000 UTC m=+799.853114054" observedRunningTime="2026-04-20 15:06:48.511095198 +0000 UTC m=+800.242413503" watchObservedRunningTime="2026-04-20 15:06:48.512194435 +0000 UTC m=+800.243512723" Apr 20 15:06:59.504909 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:06:59.504874 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-f96598d7b-t5qwc" Apr 20 15:08:28.846857 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:08:28.846765 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/2.log" Apr 20 15:08:28.847621 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:08:28.847404 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/2.log" Apr 20 15:08:28.852951 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:08:28.852929 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-acl-logging/0.log" Apr 20 
15:08:28.853145 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:08:28.853130 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-acl-logging/0.log" Apr 20 15:13:28.873351 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:13:28.873322 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/2.log" Apr 20 15:13:28.874732 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:13:28.874707 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/2.log" Apr 20 15:13:28.879075 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:13:28.879049 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-acl-logging/0.log" Apr 20 15:13:28.880178 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:13:28.880152 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-acl-logging/0.log" Apr 20 15:18:28.898578 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:18:28.898550 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/2.log" Apr 20 15:18:28.901224 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:18:28.901201 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/2.log" Apr 20 15:18:28.904568 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:18:28.904548 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-acl-logging/0.log" Apr 20 15:18:28.906876 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:18:28.906859 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-acl-logging/0.log" Apr 20 15:23:28.926686 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:23:28.926653 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/2.log" Apr 20 15:23:28.928917 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:23:28.928894 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/2.log" Apr 20 15:23:28.931999 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:23:28.931979 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-acl-logging/0.log" Apr 20 15:23:28.934195 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:23:28.934179 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-acl-logging/0.log" Apr 20 15:27:53.261364 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:27:53.261284 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-f96598d7b-t5qwc_bab56dd7-c872-4589-9032-143fb2f51f53/manager/0.log" Apr 20 15:27:53.771758 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:27:53.771726 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-854569cf8c-zr74q_1301f538-12c6-4361-8cfa-37d9a2f7f4be/manager/0.log" Apr 20 15:27:53.889711 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:27:53.889672 2575 
log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-rm7ll_94d98787-7dcd-442b-bfea-1e1cb9833889/postgres/0.log" Apr 20 15:27:56.898423 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:27:56.898374 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-597dfdc786-kfh7q_95518758-6800-47d6-a2b2-133367bc4bf8/kube-auth-proxy/0.log" Apr 20 15:27:57.847997 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:27:57.847958 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj_dc00640a-59de-471b-b4f9-bc8725ae64b2/main/0.log" Apr 20 15:27:57.855147 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:27:57.855125 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-dsfvj_dc00640a-59de-471b-b4f9-bc8725ae64b2/storage-initializer/0.log" Apr 20 15:27:57.985079 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:27:57.985049 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-nvdqw_8452376a-6fe6-4616-b5f0-3702f95b1d4b/storage-initializer/0.log" Apr 20 15:27:57.992655 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:27:57.992635 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-nvdqw_8452376a-6fe6-4616-b5f0-3702f95b1d4b/main/0.log" Apr 20 15:27:58.127214 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:27:58.127093 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br_fb668fe3-743d-479c-807b-ee99d9df6fa0/storage-initializer/0.log" Apr 20 15:27:58.135023 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:27:58.134989 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcct55br_fb668fe3-743d-479c-807b-ee99d9df6fa0/main/0.log" Apr 20 15:27:58.258853 ip-10-0-129-82 
kubenswrapper[2575]: I0420 15:27:58.258819 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq_57b489ab-a8d9-4a73-b1cb-f77423937f72/storage-initializer/0.log" Apr 20 15:27:58.266011 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:27:58.265987 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-4rfqq_57b489ab-a8d9-4a73-b1cb-f77423937f72/main/0.log" Apr 20 15:28:05.428246 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:05.428219 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-ht2vg_2d3ee4f7-b2d5-4a5c-985e-0cac25e122ae/global-pull-secret-syncer/0.log" Apr 20 15:28:05.627757 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:05.627727 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-w4psc_f4236e7c-46e3-443b-9430-39ff80fbd8dc/konnectivity-agent/0.log" Apr 20 15:28:05.647422 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:05.647378 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-82.ec2.internal_c510f351159c758475778f44c8d7da56/haproxy/0.log" Apr 20 15:28:11.701007 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:11.700976 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mk8bk_c974df5b-afcc-4232-9913-36d4d36cd14b/kube-state-metrics/0.log" Apr 20 15:28:11.723180 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:11.723154 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mk8bk_c974df5b-afcc-4232-9913-36d4d36cd14b/kube-rbac-proxy-main/0.log" Apr 20 15:28:11.749739 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:11.749711 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mk8bk_c974df5b-afcc-4232-9913-36d4d36cd14b/kube-rbac-proxy-self/0.log" Apr 20 15:28:11.778186 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:11.778160 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-57b4bd5cff-588gf_fe9129b1-1735-446d-a04a-9eceeb28fad7/metrics-server/0.log" Apr 20 15:28:11.967651 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:11.967619 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n6qq5_7cde96ba-1beb-4dd0-91b3-6c4339468969/node-exporter/0.log" Apr 20 15:28:12.028656 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:12.028629 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n6qq5_7cde96ba-1beb-4dd0-91b3-6c4339468969/kube-rbac-proxy/0.log" Apr 20 15:28:12.056945 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:12.056919 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n6qq5_7cde96ba-1beb-4dd0-91b3-6c4339468969/init-textfile/0.log" Apr 20 15:28:12.191306 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:12.191227 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-cq7ww_fe2df217-e552-4eed-993d-b467cccf24b4/kube-rbac-proxy-main/0.log" Apr 20 15:28:12.221502 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:12.221473 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-cq7ww_fe2df217-e552-4eed-993d-b467cccf24b4/kube-rbac-proxy-self/0.log" Apr 20 15:28:12.247689 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:12.247666 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-cq7ww_fe2df217-e552-4eed-993d-b467cccf24b4/openshift-state-metrics/0.log" Apr 20 15:28:12.289131 ip-10-0-129-82 
kubenswrapper[2575]: I0420 15:28:12.289107 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_34eda7fa-4142-48e1-920d-a1a1d107166c/prometheus/0.log" Apr 20 15:28:12.317465 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:12.317434 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_34eda7fa-4142-48e1-920d-a1a1d107166c/config-reloader/0.log" Apr 20 15:28:12.343775 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:12.343748 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_34eda7fa-4142-48e1-920d-a1a1d107166c/thanos-sidecar/0.log" Apr 20 15:28:12.365216 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:12.365189 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_34eda7fa-4142-48e1-920d-a1a1d107166c/kube-rbac-proxy-web/0.log" Apr 20 15:28:12.398292 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:12.398265 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_34eda7fa-4142-48e1-920d-a1a1d107166c/kube-rbac-proxy/0.log" Apr 20 15:28:12.421227 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:12.421197 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_34eda7fa-4142-48e1-920d-a1a1d107166c/kube-rbac-proxy-thanos/0.log" Apr 20 15:28:12.442581 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:12.442488 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_34eda7fa-4142-48e1-920d-a1a1d107166c/init-config-reloader/0.log" Apr 20 15:28:12.471817 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:12.471789 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-zjqc7_af3e061e-0ffd-4228-88bb-e228561992bd/prometheus-operator/0.log" Apr 20 15:28:12.496271 ip-10-0-129-82 kubenswrapper[2575]: 
I0420 15:28:12.496242 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-zjqc7_af3e061e-0ffd-4228-88bb-e228561992bd/kube-rbac-proxy/0.log" Apr 20 15:28:12.522734 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:12.522697 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-pgpn6_8d600620-fcd1-47ac-884e-38d6ff2fb62c/prometheus-operator-admission-webhook/0.log" Apr 20 15:28:13.848155 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:13.848103 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-5flsl_6cda8435-e869-40a8-9726-f7b6d4767009/networking-console-plugin/0.log" Apr 20 15:28:14.351434 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.351390 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/2.log" Apr 20 15:28:14.354026 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.354007 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/3.log" Apr 20 15:28:14.586916 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.586880 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn"] Apr 20 15:28:14.590788 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.590765 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" Apr 20 15:28:14.593176 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.593154 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jrbql\"/\"kube-root-ca.crt\"" Apr 20 15:28:14.593349 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.593329 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jrbql\"/\"default-dockercfg-whr8p\"" Apr 20 15:28:14.594000 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.593985 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jrbql\"/\"openshift-service-ca.crt\"" Apr 20 15:28:14.598652 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.598503 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn"] Apr 20 15:28:14.656934 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.656840 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8sr7\" (UniqueName: \"kubernetes.io/projected/51645812-3709-4977-8661-9f9b966f4708-kube-api-access-z8sr7\") pod \"perf-node-gather-daemonset-qb6mn\" (UID: \"51645812-3709-4977-8661-9f9b966f4708\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" Apr 20 15:28:14.656934 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.656893 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/51645812-3709-4977-8661-9f9b966f4708-podres\") pod \"perf-node-gather-daemonset-qb6mn\" (UID: \"51645812-3709-4977-8661-9f9b966f4708\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" Apr 20 15:28:14.657153 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.657007 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/51645812-3709-4977-8661-9f9b966f4708-sys\") pod \"perf-node-gather-daemonset-qb6mn\" (UID: \"51645812-3709-4977-8661-9f9b966f4708\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" Apr 20 15:28:14.657153 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.657031 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/51645812-3709-4977-8661-9f9b966f4708-lib-modules\") pod \"perf-node-gather-daemonset-qb6mn\" (UID: \"51645812-3709-4977-8661-9f9b966f4708\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" Apr 20 15:28:14.657153 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.657051 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/51645812-3709-4977-8661-9f9b966f4708-proc\") pod \"perf-node-gather-daemonset-qb6mn\" (UID: \"51645812-3709-4977-8661-9f9b966f4708\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" Apr 20 15:28:14.757843 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.757807 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/51645812-3709-4977-8661-9f9b966f4708-sys\") pod \"perf-node-gather-daemonset-qb6mn\" (UID: \"51645812-3709-4977-8661-9f9b966f4708\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" Apr 20 15:28:14.757843 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.757840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/51645812-3709-4977-8661-9f9b966f4708-lib-modules\") pod \"perf-node-gather-daemonset-qb6mn\" (UID: \"51645812-3709-4977-8661-9f9b966f4708\") " 
pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" Apr 20 15:28:14.758097 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.757858 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/51645812-3709-4977-8661-9f9b966f4708-proc\") pod \"perf-node-gather-daemonset-qb6mn\" (UID: \"51645812-3709-4977-8661-9f9b966f4708\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" Apr 20 15:28:14.758097 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.757908 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8sr7\" (UniqueName: \"kubernetes.io/projected/51645812-3709-4977-8661-9f9b966f4708-kube-api-access-z8sr7\") pod \"perf-node-gather-daemonset-qb6mn\" (UID: \"51645812-3709-4977-8661-9f9b966f4708\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" Apr 20 15:28:14.758097 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.757947 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/51645812-3709-4977-8661-9f9b966f4708-podres\") pod \"perf-node-gather-daemonset-qb6mn\" (UID: \"51645812-3709-4977-8661-9f9b966f4708\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" Apr 20 15:28:14.758097 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.757949 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/51645812-3709-4977-8661-9f9b966f4708-sys\") pod \"perf-node-gather-daemonset-qb6mn\" (UID: \"51645812-3709-4977-8661-9f9b966f4708\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" Apr 20 15:28:14.758097 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.758005 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/51645812-3709-4977-8661-9f9b966f4708-lib-modules\") pod \"perf-node-gather-daemonset-qb6mn\" (UID: \"51645812-3709-4977-8661-9f9b966f4708\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" Apr 20 15:28:14.758097 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.758014 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/51645812-3709-4977-8661-9f9b966f4708-proc\") pod \"perf-node-gather-daemonset-qb6mn\" (UID: \"51645812-3709-4977-8661-9f9b966f4708\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" Apr 20 15:28:14.758097 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.758040 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/51645812-3709-4977-8661-9f9b966f4708-podres\") pod \"perf-node-gather-daemonset-qb6mn\" (UID: \"51645812-3709-4977-8661-9f9b966f4708\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" Apr 20 15:28:14.766507 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.766488 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8sr7\" (UniqueName: \"kubernetes.io/projected/51645812-3709-4977-8661-9f9b966f4708-kube-api-access-z8sr7\") pod \"perf-node-gather-daemonset-qb6mn\" (UID: \"51645812-3709-4977-8661-9f9b966f4708\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" Apr 20 15:28:14.903223 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:14.903177 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" Apr 20 15:28:15.025798 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:15.025773 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn"] Apr 20 15:28:15.028471 ip-10-0-129-82 kubenswrapper[2575]: W0420 15:28:15.028438 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod51645812_3709_4977_8661_9f9b966f4708.slice/crio-a7f2153d1aaaa6c97df465b7e93656e1110ab0438a7dc3dfe3691de36eb74bae WatchSource:0}: Error finding container a7f2153d1aaaa6c97df465b7e93656e1110ab0438a7dc3dfe3691de36eb74bae: Status 404 returned error can't find the container with id a7f2153d1aaaa6c97df465b7e93656e1110ab0438a7dc3dfe3691de36eb74bae Apr 20 15:28:15.030117 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:15.030095 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 15:28:16.036625 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:16.036583 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" event={"ID":"51645812-3709-4977-8661-9f9b966f4708","Type":"ContainerStarted","Data":"3c057625f733b43136df66e307609db554200708ce953781c6d2991908f6c07b"} Apr 20 15:28:16.036625 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:16.036630 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" event={"ID":"51645812-3709-4977-8661-9f9b966f4708","Type":"ContainerStarted","Data":"a7f2153d1aaaa6c97df465b7e93656e1110ab0438a7dc3dfe3691de36eb74bae"} Apr 20 15:28:16.037345 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:16.036720 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" Apr 20 15:28:16.053735 ip-10-0-129-82 kubenswrapper[2575]: I0420 
15:28:16.053684 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" podStartSLOduration=2.053666784 podStartE2EDuration="2.053666784s" podCreationTimestamp="2026-04-20 15:28:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:28:16.05221372 +0000 UTC m=+2087.783532008" watchObservedRunningTime="2026-04-20 15:28:16.053666784 +0000 UTC m=+2087.784985066" Apr 20 15:28:16.230972 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:16.230943 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-t7cf5_b3af7863-723b-45a3-8247-7e29b9a9da3c/dns/0.log" Apr 20 15:28:16.252025 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:16.252002 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-t7cf5_b3af7863-723b-45a3-8247-7e29b9a9da3c/kube-rbac-proxy/0.log" Apr 20 15:28:16.372473 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:16.372398 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hscb7_d8c441a8-d683-4309-8a59-c1525285f7e1/dns-node-resolver/0.log" Apr 20 15:28:16.846998 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:16.846972 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9vx92_007a8d25-f684-41a4-a2f6-d4e2f7bd79d5/node-ca/0.log" Apr 20 15:28:17.896908 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:17.896879 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-597dfdc786-kfh7q_95518758-6800-47d6-a2b2-133367bc4bf8/kube-auth-proxy/0.log" Apr 20 15:28:18.529159 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:18.529128 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-fccb4_a808e761-5c95-412e-a362-7e3ffb34caeb/serve-healthcheck-canary/0.log" 
Apr 20 15:28:19.200053 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:19.200023 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xf6wz_c79acd36-8478-42d1-bf37-0e5b738f4737/kube-rbac-proxy/0.log" Apr 20 15:28:19.222983 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:19.222949 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xf6wz_c79acd36-8478-42d1-bf37-0e5b738f4737/exporter/0.log" Apr 20 15:28:19.244164 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:19.244136 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xf6wz_c79acd36-8478-42d1-bf37-0e5b738f4737/extractor/0.log" Apr 20 15:28:21.274182 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:21.274148 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-f96598d7b-t5qwc_bab56dd7-c872-4589-9032-143fb2f51f53/manager/0.log" Apr 20 15:28:21.436222 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:21.436116 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-854569cf8c-zr74q_1301f538-12c6-4361-8cfa-37d9a2f7f4be/manager/0.log" Apr 20 15:28:21.463456 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:21.463427 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-rm7ll_94d98787-7dcd-442b-bfea-1e1cb9833889/postgres/0.log" Apr 20 15:28:22.050018 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:22.049989 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-qb6mn" Apr 20 15:28:22.593781 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:22.593749 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-54f6c466b9-rdk9d_375e227b-169e-4f82-8fc0-f666eb13f899/manager/0.log" Apr 20 
15:28:27.560688 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:27.560655 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-8l5tw_fce9aead-ae79-449d-9e77-55a7a14471b5/kube-storage-version-migrator-operator/1.log"
Apr 20 15:28:27.561436 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:27.561423 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-8l5tw_fce9aead-ae79-449d-9e77-55a7a14471b5/kube-storage-version-migrator-operator/0.log"
Apr 20 15:28:28.576381 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:28.576319 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-59qmp_326469e8-2bee-4754-a084-2cfc2ffe79a2/kube-multus/0.log"
Apr 20 15:28:28.906203 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:28.906130 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wzx48_b8808028-95d4-494d-8038-d6152f52c0e3/kube-multus-additional-cni-plugins/0.log"
Apr 20 15:28:28.926457 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:28.926430 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wzx48_b8808028-95d4-494d-8038-d6152f52c0e3/egress-router-binary-copy/0.log"
Apr 20 15:28:28.947719 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:28.947694 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wzx48_b8808028-95d4-494d-8038-d6152f52c0e3/cni-plugins/0.log"
Apr 20 15:28:28.952665 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:28.952639 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/2.log"
Apr 20 15:28:28.955419 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:28.955400 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-b8vvq_a3b8c0ca-6a14-4aa3-b779-8722694554e7/console-operator/2.log"
Apr 20 15:28:28.958579 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:28.958560 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-acl-logging/0.log"
Apr 20 15:28:28.961256 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:28.961240 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-acl-logging/0.log"
Apr 20 15:28:28.972709 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:28.972690 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wzx48_b8808028-95d4-494d-8038-d6152f52c0e3/bond-cni-plugin/0.log"
Apr 20 15:28:28.996098 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:28.996082 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wzx48_b8808028-95d4-494d-8038-d6152f52c0e3/routeoverride-cni/0.log"
Apr 20 15:28:29.020678 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:29.020648 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wzx48_b8808028-95d4-494d-8038-d6152f52c0e3/whereabouts-cni-bincopy/0.log"
Apr 20 15:28:29.041825 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:29.041799 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wzx48_b8808028-95d4-494d-8038-d6152f52c0e3/whereabouts-cni/0.log"
Apr 20 15:28:29.227068 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:29.227036 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vbpm4_aaf83337-5403-4bd0-b782-5d5fa014368f/network-metrics-daemon/0.log"
Apr 20 15:28:29.246745 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:29.246719 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vbpm4_aaf83337-5403-4bd0-b782-5d5fa014368f/kube-rbac-proxy/0.log"
Apr 20 15:28:30.714158 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:30.714128 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-controller/0.log"
Apr 20 15:28:30.732080 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:30.732050 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-acl-logging/0.log"
Apr 20 15:28:30.741576 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:30.741552 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovn-acl-logging/1.log"
Apr 20 15:28:30.760556 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:30.760499 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/kube-rbac-proxy-node/0.log"
Apr 20 15:28:30.785201 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:30.785174 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 15:28:30.803595 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:30.803572 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/northd/0.log"
Apr 20 15:28:30.823607 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:30.823586 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/nbdb/0.log"
Apr 20 15:28:30.843039 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:30.843017 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/sbdb/0.log"
Apr 20 15:28:30.946484 ip-10-0-129-82 kubenswrapper[2575]: I0420 15:28:30.946452 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8mhl_3b839cc0-9133-43ab-a8ea-a31b28df87b2/ovnkube-controller/0.log"