Apr 17 07:52:12.235600 ip-10-0-137-165 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 07:52:12.641835 ip-10-0-137-165 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:52:12.641835 ip-10-0-137-165 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 07:52:12.641835 ip-10-0-137-165 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:52:12.641835 ip-10-0-137-165 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 07:52:12.641835 ip-10-0-137-165 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:52:12.643462 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.643293    2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 07:52:12.648321 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648296    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:52:12.648321 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648315    2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:52:12.648321 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648320    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:52:12.648321 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648325    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:52:12.648321 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648329    2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:52:12.648617 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648333    2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:52:12.648617 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648338    2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:52:12.648617 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648341    2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:52:12.648617 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648346    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:52:12.648617 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648350    2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:52:12.648617 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648353    2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:52:12.648617 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648358    2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:52:12.648617 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648362    2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:52:12.648617 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648365    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:52:12.648617 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648369    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:52:12.648617 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648373    2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:52:12.648617 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648377    2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:52:12.648617 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648384    2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:52:12.648617 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648388    2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:52:12.648617 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648392    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:52:12.648617 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648396    2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:52:12.648617 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648406    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:52:12.648617 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648410    2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:52:12.648617 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648414    2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:52:12.649331 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648418    2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:52:12.649331 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648421    2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:52:12.649331 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648425    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:52:12.649331 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648429    2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:52:12.649331 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648433    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:52:12.649331 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648436    2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:52:12.649331 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648440    2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:52:12.649331 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648443    2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:52:12.649331 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648447    2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:52:12.649331 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648451    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:52:12.649331 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648455    2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:52:12.649331 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648458    2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:52:12.649331 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648462    2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:52:12.649331 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648466    2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:52:12.649331 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648472    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:52:12.649331 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648477    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:52:12.649331 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648481    2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:52:12.649331 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648486    2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:52:12.649331 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648490    2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:52:12.649331 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648494    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:52:12.649862 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648498    2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:52:12.649862 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648502    2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:52:12.649862 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648506    2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:52:12.649862 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648511    2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:52:12.649862 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648516    2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:52:12.649862 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648520    2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:52:12.649862 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648524    2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:52:12.649862 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648528    2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:52:12.649862 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648532    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:52:12.649862 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648536    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:52:12.649862 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648540    2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:52:12.649862 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648544    2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:52:12.649862 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648548    2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:52:12.649862 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648552    2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:52:12.649862 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648556    2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:52:12.649862 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648560    2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:52:12.649862 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648564    2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:52:12.649862 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648568    2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:52:12.649862 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648574    2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:52:12.649862 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648578    2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:52:12.650468 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648583    2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:52:12.650468 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648587    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:52:12.650468 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648591    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:52:12.650468 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648595    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:52:12.650468 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648600    2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:52:12.650468 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648604    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:52:12.650468 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648608    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:52:12.650468 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648613    2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:52:12.650468 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648617    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:52:12.650468 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648620    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:52:12.650468 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648626    2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:52:12.650468 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648631    2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:52:12.650468 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648638    2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:52:12.650468 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648644    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:52:12.650468 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648651    2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:52:12.650468 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648656    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:52:12.650468 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648661    2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:52:12.650468 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648665    2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:52:12.650468 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648669    2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:52:12.650468 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648673    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:52:12.650974 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648677    2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:52:12.650974 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.648681    2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:52:12.651237 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651221    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:52:12.651237 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651236    2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:52:12.651383 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651241    2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:52:12.651383 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651246    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:52:12.651383 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651250    2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:52:12.651383 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651254    2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:52:12.651383 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651258    2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:52:12.651383 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651263    2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:52:12.651383 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651267    2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:52:12.651383 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651271    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:52:12.651383 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651291    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:52:12.651383 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651296    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:52:12.651383 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651299    2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:52:12.651383 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651303    2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:52:12.651383 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651307    2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:52:12.651383 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651311    2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:52:12.651383 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651318    2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:52:12.651383 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651324    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:52:12.651383 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651328    2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:52:12.651383 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651332    2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:52:12.651383 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651337    2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:52:12.651383 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651341    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:52:12.652220 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651345    2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:52:12.652220 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651349    2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:52:12.652220 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651355    2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:52:12.652220 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651359    2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:52:12.652220 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651363    2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:52:12.652220 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651367    2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:52:12.652220 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651371    2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:52:12.652220 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651376    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:52:12.652220 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651380    2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:52:12.652220 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651384    2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:52:12.652220 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651387    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:52:12.652220 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651392    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:52:12.652220 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651396    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:52:12.652220 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651400    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:52:12.652220 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651404    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:52:12.652220 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651408    2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:52:12.652220 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651412    2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:52:12.652220 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651416    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:52:12.652220 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651424    2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:52:12.653131 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651430    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:52:12.653131 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651434    2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:52:12.653131 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651438    2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:52:12.653131 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651443    2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:52:12.653131 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651446    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:52:12.653131 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651450    2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:52:12.653131 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651454    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:52:12.653131 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651458    2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:52:12.653131 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651463    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:52:12.653131 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651467    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:52:12.653131 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651471    2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:52:12.653131 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651475    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:52:12.653131 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651479    2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:52:12.653131 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651483    2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:52:12.653131 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651488    2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:52:12.653131 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651492    2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:52:12.653131 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651497    2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:52:12.653131 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651502    2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:52:12.653131 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651506    2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:52:12.653131 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651510    2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:52:12.653710 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651515    2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:52:12.653710 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651519    2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:52:12.653710 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651523    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:52:12.653710 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651527    2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:52:12.653710 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651532    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:52:12.653710 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651536    2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:52:12.653710 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651540    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:52:12.653710 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651545    2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:52:12.653710 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651549    2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:52:12.653710 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651553    2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:52:12.653710 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651560    2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:52:12.653710 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651564    2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:52:12.653710 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651568    2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:52:12.653710 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651572    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:52:12.653710 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651577    2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:52:12.653710 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651581    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:52:12.653710 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651585    2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:52:12.653710 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651589    2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:52:12.653710 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651593    2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:52:12.653710 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651597    2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651601    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651606    2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651609    2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651613    2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.651618    2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651735    2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651747    2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651763    2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651770    2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651778    2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651783    2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651790    2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651797    2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651803    2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651808    2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651814    2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651819    2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651824    2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651829    2576 flags.go:64] FLAG: --cgroup-root=""
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651833    2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651838    2576 flags.go:64] FLAG: --client-ca-file=""
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651843    2576 flags.go:64] FLAG: --cloud-config=""
Apr 17 07:52:12.654324 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651848    2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651853    2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651859    2576 flags.go:64] FLAG: --cluster-domain=""
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651863    2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651869    2576 flags.go:64] FLAG: --config-dir=""
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651873    2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651878    2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651885    2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651890    2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651895    2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651901    2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651905    2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651910    2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651923    2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651928    2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651933    2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651941    2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651947    2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651951    2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651956    2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651962    2576 flags.go:64] FLAG: --enable-server="true"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651968    2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651975    2576 flags.go:64] FLAG: --event-burst="100"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651980    2576 flags.go:64] FLAG: --event-qps="50"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651985    2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 07:52:12.655040 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651990    2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.651995    2576 flags.go:64] FLAG: --eviction-hard=""
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652001    2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652005    2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652010    2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652014    2576 flags.go:64] FLAG: --eviction-soft=""
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652019    2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652024    2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652029    2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652034    2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652038    2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652043    2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652048    2576 flags.go:64] FLAG: --feature-gates=""
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652054    2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652059    2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652064    2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652069    2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652074    2576 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652079    2576 flags.go:64] FLAG: --help="false"
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652084    2576 flags.go:64] FLAG: --hostname-override="ip-10-0-137-165.ec2.internal"
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652090    2576 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652095    2576 flags.go:64]
FLAG: --http-check-frequency="20s" Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652100 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652106 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 07:52:12.655778 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652112 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652118 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652123 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652127 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652133 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652139 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652144 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652149 2576 flags.go:64] FLAG: --kube-reserved="" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652153 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652158 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652163 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 07:52:12.656497 ip-10-0-137-165 
kubenswrapper[2576]: I0417 07:52:12.652168 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652173 2576 flags.go:64] FLAG: --lock-file="" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652177 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652182 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652187 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652196 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652201 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652206 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652210 2576 flags.go:64] FLAG: --logging-format="text" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652215 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652220 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652225 2576 flags.go:64] FLAG: --manifest-url="" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652230 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652237 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 07:52:12.656497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652242 2576 flags.go:64] FLAG: 
--max-open-files="1000000" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652256 2576 flags.go:64] FLAG: --max-pods="110" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652261 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652266 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652285 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652291 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652296 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652301 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652307 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652319 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652324 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652329 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652335 2576 flags.go:64] FLAG: --pod-cidr="" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652340 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 07:52:12.657106 ip-10-0-137-165 
kubenswrapper[2576]: I0417 07:52:12.652348 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652353 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652358 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652363 2576 flags.go:64] FLAG: --port="10250" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652368 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652373 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02b0c546307765043" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652378 2576 flags.go:64] FLAG: --qos-reserved="" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652382 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652387 2576 flags.go:64] FLAG: --register-node="true" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652393 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 17 07:52:12.657106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652397 2576 flags.go:64] FLAG: --register-with-taints="" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652404 2576 flags.go:64] FLAG: --registry-burst="10" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652409 2576 flags.go:64] FLAG: --registry-qps="5" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652413 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652418 2576 flags.go:64] FLAG: --reserved-memory="" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 
07:52:12.652424 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652429 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652434 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652439 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652444 2576 flags.go:64] FLAG: --runonce="false" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652449 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652455 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652459 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652464 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652469 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652474 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652479 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652484 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652489 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652493 2576 flags.go:64] FLAG: 
--storage-driver-table="stats" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652498 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652504 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652509 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652513 2576 flags.go:64] FLAG: --system-cgroups="" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652518 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 07:52:12.657751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652525 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652530 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652539 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652547 2576 flags.go:64] FLAG: --tls-min-version="" Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652551 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652556 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652562 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652567 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652572 2576 flags.go:64] FLAG: --v="2" Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: I0417 
07:52:12.652579 2576 flags.go:64] FLAG: --version="false" Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652586 2576 flags.go:64] FLAG: --vmodule="" Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652593 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.652598 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652755 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652763 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652769 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652774 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652778 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652783 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652787 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652791 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652796 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652800 2576 feature_gate.go:328] 
unrecognized feature gate: NetworkSegmentation Apr 17 07:52:12.658375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652805 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 07:52:12.658944 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652809 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 07:52:12.658944 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652813 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 07:52:12.658944 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652817 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 07:52:12.658944 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652825 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 07:52:12.658944 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652831 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 07:52:12.658944 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652836 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 07:52:12.658944 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652841 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 07:52:12.658944 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652846 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 07:52:12.658944 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652850 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 07:52:12.658944 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652855 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 07:52:12.658944 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652861 2576 feature_gate.go:328] 
unrecognized feature gate: BuildCSIVolumes Apr 17 07:52:12.658944 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652865 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 07:52:12.658944 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652869 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 07:52:12.658944 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652876 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 07:52:12.658944 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652880 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 07:52:12.658944 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652884 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 07:52:12.658944 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652888 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 07:52:12.658944 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652892 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 07:52:12.658944 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652896 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 07:52:12.659556 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652901 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 07:52:12.659556 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652905 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 07:52:12.659556 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652909 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 07:52:12.659556 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652914 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 07:52:12.659556 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652918 2576 feature_gate.go:328] 
unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 07:52:12.659556 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652922 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 07:52:12.659556 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652927 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 07:52:12.659556 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652931 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 07:52:12.659556 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652936 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 07:52:12.659556 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652940 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 07:52:12.659556 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652944 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 07:52:12.659556 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652948 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 07:52:12.659556 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652952 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 07:52:12.659556 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652956 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 17 07:52:12.659556 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652960 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 07:52:12.659556 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652965 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 07:52:12.659556 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652970 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 07:52:12.659556 ip-10-0-137-165 kubenswrapper[2576]: W0417 
07:52:12.652974 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 07:52:12.659556 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652978 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 07:52:12.659556 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652983 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 07:52:12.660063 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652987 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 07:52:12.660063 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652992 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 07:52:12.660063 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.652996 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 07:52:12.660063 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653002 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 07:52:12.660063 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653006 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 07:52:12.660063 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653010 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 07:52:12.660063 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653017 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 07:52:12.660063 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653022 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 07:52:12.660063 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653026 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 07:52:12.660063 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653030 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 07:52:12.660063 
ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653034 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 07:52:12.660063 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653039 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 07:52:12.660063 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653042 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 07:52:12.660063 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653046 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 07:52:12.660063 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653052 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 07:52:12.660063 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653056 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 07:52:12.660063 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653060 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 07:52:12.660063 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653065 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 07:52:12.660063 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653069 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 07:52:12.660557 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653073 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 07:52:12.660557 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653077 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 07:52:12.660557 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653081 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 07:52:12.660557 ip-10-0-137-165 kubenswrapper[2576]: W0417 
07:52:12.653086 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 07:52:12.660557 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653090 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 07:52:12.660557 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653096 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 07:52:12.660557 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653102 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 07:52:12.660557 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653106 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 07:52:12.660557 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653111 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 07:52:12.660557 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653114 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 07:52:12.660557 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653118 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 07:52:12.660557 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653122 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 07:52:12.660557 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653127 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 07:52:12.660557 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653132 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 07:52:12.660557 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653136 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 07:52:12.660557 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653140 2576 feature_gate.go:328] unrecognized feature 
gate: InsightsOnDemandDataGather
Apr 17 07:52:12.660557 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.653146 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:52:12.660975 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.654049 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:52:12.660975 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.660726 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 07:52:12.660975 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.660744 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 07:52:12.660975 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660794 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:52:12.660975 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660799 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:52:12.660975 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660802 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:52:12.660975 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660806 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:52:12.660975 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660809 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:52:12.660975 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660813 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:52:12.660975 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660816 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:52:12.660975 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660819 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:52:12.660975 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660821 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:52:12.660975 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660824 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:52:12.660975 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660827 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:52:12.660975 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660830 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:52:12.661375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660832 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:52:12.661375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660835 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:52:12.661375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660838 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:52:12.661375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660840 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:52:12.661375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660860 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:52:12.661375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660865 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:52:12.661375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660868 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:52:12.661375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660871 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:52:12.661375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660874 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:52:12.661375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660877 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:52:12.661375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660880 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:52:12.661375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660882 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:52:12.661375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660885 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:52:12.661375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660888 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:52:12.661375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660890 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:52:12.661375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660893 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:52:12.661375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660896 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:52:12.661375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660898 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:52:12.661375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660901 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:52:12.661375 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660905 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:52:12.661877 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660908 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:52:12.661877 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660911 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:52:12.661877 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660914 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:52:12.661877 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660917 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:52:12.661877 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660919 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:52:12.661877 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660922 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:52:12.661877 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660924 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:52:12.661877 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660927 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:52:12.661877 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660929 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:52:12.661877 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660932 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:52:12.661877 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660934 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:52:12.661877 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660937 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:52:12.661877 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660939 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:52:12.661877 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660942 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:52:12.661877 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660946 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:52:12.661877 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660949 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:52:12.661877 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660952 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:52:12.661877 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660955 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:52:12.661877 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660958 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:52:12.661877 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660960 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:52:12.662381 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660963 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:52:12.662381 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660965 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:52:12.662381 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660968 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:52:12.662381 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660970 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:52:12.662381 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660972 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:52:12.662381 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660975 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:52:12.662381 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660978 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:52:12.662381 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660980 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:52:12.662381 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660983 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:52:12.662381 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660986 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:52:12.662381 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660989 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:52:12.662381 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660992 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:52:12.662381 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660996 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:52:12.662381 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.660998 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:52:12.662381 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661001 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:52:12.662381 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661004 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:52:12.662381 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661006 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:52:12.662381 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661009 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:52:12.662381 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661011 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:52:12.662381 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661014 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:52:12.662868 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661017 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:52:12.662868 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661020 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:52:12.662868 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661022 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:52:12.662868 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661025 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:52:12.662868 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661028 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:52:12.662868 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661032 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:52:12.662868 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661035 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:52:12.662868 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661038 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:52:12.662868 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661040 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:52:12.662868 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661043 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:52:12.662868 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661045 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:52:12.662868 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661048 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:52:12.662868 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661050 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:52:12.662868 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661053 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:52:12.662868 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.661058 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:52:12.662868 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661155 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:52:12.663324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661158 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:52:12.663324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661162 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:52:12.663324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661165 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:52:12.663324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661168 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:52:12.663324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661171 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:52:12.663324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661174 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:52:12.663324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661177 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:52:12.663324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661181 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:52:12.663324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661184 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:52:12.663324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661188 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:52:12.663324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661190 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:52:12.663324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661193 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:52:12.663324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661198 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:52:12.663324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661201 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:52:12.663324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661204 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:52:12.663324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661207 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:52:12.663324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661210 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:52:12.663324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661213 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:52:12.663324 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661216 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:52:12.663787 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661218 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:52:12.663787 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661221 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:52:12.663787 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661224 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:52:12.663787 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661227 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:52:12.663787 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661230 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:52:12.663787 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661232 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:52:12.663787 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661235 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:52:12.663787 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661238 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:52:12.663787 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661241 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:52:12.663787 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661243 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:52:12.663787 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661246 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:52:12.663787 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661248 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:52:12.663787 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661251 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:52:12.663787 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661253 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:52:12.663787 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661256 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:52:12.663787 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661258 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:52:12.663787 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661261 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:52:12.663787 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661263 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:52:12.663787 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661266 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:52:12.664270 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661268 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:52:12.664270 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661284 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:52:12.664270 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661287 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:52:12.664270 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661290 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:52:12.664270 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661293 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:52:12.664270 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661295 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:52:12.664270 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661298 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:52:12.664270 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661301 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:52:12.664270 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661303 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:52:12.664270 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661306 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:52:12.664270 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661309 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:52:12.664270 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661312 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:52:12.664270 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661314 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:52:12.664270 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661317 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:52:12.664270 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661320 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:52:12.664270 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661322 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:52:12.664270 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661325 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:52:12.664270 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661328 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:52:12.664270 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661330 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:52:12.664270 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661332 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:52:12.664850 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661335 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:52:12.664850 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661337 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:52:12.664850 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661340 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:52:12.664850 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661343 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:52:12.664850 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661345 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:52:12.664850 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661348 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:52:12.664850 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661351 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:52:12.664850 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661353 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:52:12.664850 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661356 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:52:12.664850 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661358 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:52:12.664850 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661361 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:52:12.664850 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661363 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:52:12.664850 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661366 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:52:12.664850 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661370 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:52:12.664850 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661373 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:52:12.664850 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661375 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:52:12.664850 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661378 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:52:12.664850 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661381 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:52:12.664850 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661384 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:52:12.664850 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661387 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:52:12.665359 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661390 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:52:12.665359 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661394 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:52:12.665359 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661397 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:52:12.665359 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661399 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:52:12.665359 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661402 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:52:12.665359 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661404 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:52:12.665359 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:12.661407 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:52:12.665359 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.661412 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:52:12.665359 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.662121 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 07:52:12.665907 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.665892 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 07:52:12.667249 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.667230 2576 server.go:1019] "Starting client certificate rotation"
Apr 17 07:52:12.667354 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.667339 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 07:52:12.667386 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.667379 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 07:52:12.690469 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.690450 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 07:52:12.693425 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.693395 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 07:52:12.706159 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.706138 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 17 07:52:12.711502 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.711484 2576 log.go:25] "Validated CRI v1 image API"
Apr 17 07:52:12.714493 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.714479 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 07:52:12.716805 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.716780 2576 fs.go:135] Filesystem UUIDs: map[476c130c-893c-48b5-9adc-c29aaff1dbcb:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 8d9b5259-47fe-4645-8625-f7546197f68e:/dev/nvme0n1p4]
Apr 17 07:52:12.716897 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.716802 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 07:52:12.722237 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.722097 2576 manager.go:217] Machine: {Timestamp:2026-04-17 07:52:12.720542266 +0000 UTC m=+0.377405473 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099836 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec288385a0745b2802ea05cfb2fe9e6e SystemUUID:ec288385-a074-5b28-02ea-05cfb2fe9e6e BootID:bc63bafb-87ba-4e42-b08b-8002780779b7 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ab:0a:41:06:ef Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ab:0a:41:06:ef Speed:0 Mtu:9001} {Name:ovs-system MacAddress:b6:e8:d7:65:0b:9d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 07:52:12.722237 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.722211 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 07:52:12.722237 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.722222 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 07:52:12.722465 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.722356 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 07:52:12.725220 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.725190 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 07:52:12.725411 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.725225 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-165.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 07:52:12.725499 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.725422 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 07:52:12.725499 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.725436 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 07:52:12.725499 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.725453 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 07:52:12.726783 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.726770 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 07:52:12.728423 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.728410 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 07:52:12.728562 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.728550 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 07:52:12.731401 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.731389 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 07:52:12.731475 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.731416 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 07:52:12.731475 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.731438 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 07:52:12.731475 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.731454 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 17 07:52:12.731475 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.731466 2576 apiserver.go:42] "Waiting for node sync
before watching apiserver pods" Apr 17 07:52:12.732652 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.732638 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 07:52:12.732732 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.732662 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 07:52:12.735988 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.735969 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 07:52:12.737508 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.737495 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 07:52:12.739466 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.739452 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 07:52:12.739525 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.739471 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 07:52:12.739525 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.739478 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 07:52:12.739525 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.739484 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 07:52:12.739525 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.739490 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 07:52:12.739525 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.739497 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 07:52:12.739525 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.739513 2576 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 17 07:52:12.739525 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.739520 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 07:52:12.739525 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.739527 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 07:52:12.739774 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.739533 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 07:52:12.739774 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.739545 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 07:52:12.739774 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.739554 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 07:52:12.740317 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.740305 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 07:52:12.740351 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.740318 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 07:52:12.743876 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.743857 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-165.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 07:52:12.743969 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:12.743873 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 07:52:12.744306 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:12.744264 2576 reflector.go:200] "Failed 
to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-165.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 07:52:12.744393 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.744364 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 07:52:12.744487 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.744469 2576 server.go:1295] "Started kubelet" Apr 17 07:52:12.745512 ip-10-0-137-165 systemd[1]: Started Kubernetes Kubelet. Apr 17 07:52:12.745770 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.745589 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 07:52:12.745770 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.745578 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 07:52:12.745770 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.745755 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 07:52:12.746902 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.746884 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 07:52:12.748809 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.748794 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 17 07:52:12.752528 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.752503 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 07:52:12.752528 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.752525 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 07:52:12.753126 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:12.752130 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-165.ec2.internal.18a7159ad3da0181 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-165.ec2.internal,UID:ip-10-0-137-165.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-165.ec2.internal,},FirstTimestamp:2026-04-17 07:52:12.744376705 +0000 UTC m=+0.401239916,LastTimestamp:2026-04-17 07:52:12.744376705 +0000 UTC m=+0.401239916,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-165.ec2.internal,}" Apr 17 07:52:12.753221 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.753180 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 07:52:12.753221 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.753198 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 07:52:12.753221 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.753220 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 07:52:12.753404 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.753292 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 17 07:52:12.753404 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.753324 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 17 07:52:12.753514 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.753496 2576 factory.go:55] Registering systemd factory Apr 17 07:52:12.753565 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.753517 2576 factory.go:223] Registration of the systemd container factory successfully Apr 17 07:52:12.753565 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:12.753529 2576 kubelet_node_status.go:515] "Error getting the current node from lister" 
err="node \"ip-10-0-137-165.ec2.internal\" not found" Apr 17 07:52:12.753883 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.753865 2576 factory.go:153] Registering CRI-O factory Apr 17 07:52:12.753937 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.753889 2576 factory.go:223] Registration of the crio container factory successfully Apr 17 07:52:12.753987 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.753944 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 07:52:12.753987 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.753981 2576 factory.go:103] Registering Raw factory Apr 17 07:52:12.754070 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.753999 2576 manager.go:1196] Started watching for new ooms in manager Apr 17 07:52:12.754433 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.754416 2576 manager.go:319] Starting recovery of all containers Apr 17 07:52:12.760338 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:12.760306 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 07:52:12.760591 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:12.760523 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-165.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 07:52:12.761341 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.761309 2576 csr.go:274] "Certificate signing request 
is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-s45mj" Apr 17 07:52:12.764742 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.764723 2576 manager.go:324] Recovery completed Apr 17 07:52:12.769038 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.769023 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-s45mj" Apr 17 07:52:12.769793 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.769781 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:52:12.773524 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.772947 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-165.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:52:12.773617 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.773549 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-165.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:52:12.773617 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.773569 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-165.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:52:12.774239 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.774220 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 07:52:12.774239 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.774237 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 07:52:12.774395 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.774259 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 17 07:52:12.776344 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:12.776255 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-137-165.ec2.internal.18a7159ad596c8b1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-165.ec2.internal,UID:ip-10-0-137-165.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-137-165.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-137-165.ec2.internal,},FirstTimestamp:2026-04-17 07:52:12.773525681 +0000 UTC m=+0.430388888,LastTimestamp:2026-04-17 07:52:12.773525681 +0000 UTC m=+0.430388888,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-165.ec2.internal,}" Apr 17 07:52:12.776762 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.776739 2576 policy_none.go:49] "None policy: Start" Apr 17 07:52:12.776837 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.776770 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 07:52:12.776837 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.776781 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 17 07:52:12.820017 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.819996 2576 manager.go:341] "Starting Device Plugin manager" Apr 17 07:52:12.841026 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:12.820038 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 07:52:12.841026 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.820051 2576 server.go:85] "Starting device plugin registration server" Apr 17 07:52:12.841026 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.820293 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 07:52:12.841026 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.820305 2576 container_log_manager.go:189] "Initializing container 
log rotate workers" workers=1 monitorPeriod="10s" Apr 17 07:52:12.841026 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.820398 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 07:52:12.841026 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.820483 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 07:52:12.841026 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.820491 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 07:52:12.841026 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:12.820888 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 07:52:12.841026 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:12.820930 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-165.ec2.internal\" not found" Apr 17 07:52:12.856167 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.856137 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 07:52:12.857338 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.857317 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 07:52:12.857338 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.857341 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 07:52:12.857461 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.857366 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 07:52:12.857461 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.857373 2576 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 07:52:12.857461 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:12.857452 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 07:52:12.860499 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.860482 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:52:12.921329 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.921249 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 07:52:12.922253 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.922235 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-165.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 07:52:12.922346 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.922269 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-165.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 07:52:12.922346 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.922303 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-165.ec2.internal" event="NodeHasSufficientPID"
Apr 17 07:52:12.922346 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.922331 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-165.ec2.internal"
Apr 17 07:52:12.930857 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.930842 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-165.ec2.internal"
Apr 17 07:52:12.930922 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:12.930864 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-165.ec2.internal\": node \"ip-10-0-137-165.ec2.internal\" not found"
Apr 17 07:52:12.948721 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:12.948695 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-165.ec2.internal\" not found"
Apr 17 07:52:12.958103 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.958085 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-165.ec2.internal"]
Apr 17 07:52:12.958170 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.958151 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 07:52:12.958952 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.958939 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-165.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 07:52:12.959003 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.958966 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-165.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 07:52:12.959003 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.958980 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-165.ec2.internal" event="NodeHasSufficientPID"
Apr 17 07:52:12.960302 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.960253 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 07:52:12.960402 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.960387 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal"
Apr 17 07:52:12.960451 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.960420 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 07:52:12.961007 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.960993 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-165.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 07:52:12.961073 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.961010 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-165.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 07:52:12.961073 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.961023 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-165.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 07:52:12.961073 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.961033 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-165.ec2.internal" event="NodeHasSufficientPID"
Apr 17 07:52:12.961073 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.961037 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-165.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 07:52:12.961073 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.961047 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-165.ec2.internal" event="NodeHasSufficientPID"
Apr 17 07:52:12.962055 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.962036 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-165.ec2.internal"
Apr 17 07:52:12.962152 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.962072 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 07:52:12.962739 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.962722 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-165.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 07:52:12.962824 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.962749 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-165.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 07:52:12.962824 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:12.962759 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-165.ec2.internal" event="NodeHasSufficientPID"
Apr 17 07:52:12.990891 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:12.990866 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-165.ec2.internal\" not found" node="ip-10-0-137-165.ec2.internal"
Apr 17 07:52:12.995196 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:12.995180 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-165.ec2.internal\" not found" node="ip-10-0-137-165.ec2.internal"
Apr 17 07:52:13.049536 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:13.049513 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-165.ec2.internal\" not found"
Apr 17 07:52:13.055910 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.055887 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3bed48d9758f9b62336a701f0447a55a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal\" (UID: \"3bed48d9758f9b62336a701f0447a55a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal"
Apr 17 07:52:13.056006 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.055923 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3bed48d9758f9b62336a701f0447a55a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal\" (UID: \"3bed48d9758f9b62336a701f0447a55a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal"
Apr 17 07:52:13.056006 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.055961 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9a551efa739e7b700ba0ce69f0532dd7-config\") pod \"kube-apiserver-proxy-ip-10-0-137-165.ec2.internal\" (UID: \"9a551efa739e7b700ba0ce69f0532dd7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-165.ec2.internal"
Apr 17 07:52:13.149804 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:13.149770 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-165.ec2.internal\" not found"
Apr 17 07:52:13.156182 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.156163 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3bed48d9758f9b62336a701f0447a55a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal\" (UID: \"3bed48d9758f9b62336a701f0447a55a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal"
Apr 17 07:52:13.156265 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.156192 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3bed48d9758f9b62336a701f0447a55a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal\" (UID: \"3bed48d9758f9b62336a701f0447a55a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal"
Apr 17 07:52:13.156265 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.156208 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9a551efa739e7b700ba0ce69f0532dd7-config\") pod \"kube-apiserver-proxy-ip-10-0-137-165.ec2.internal\" (UID: \"9a551efa739e7b700ba0ce69f0532dd7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-165.ec2.internal"
Apr 17 07:52:13.156265 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.156243 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9a551efa739e7b700ba0ce69f0532dd7-config\") pod \"kube-apiserver-proxy-ip-10-0-137-165.ec2.internal\" (UID: \"9a551efa739e7b700ba0ce69f0532dd7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-165.ec2.internal"
Apr 17 07:52:13.156265 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.156249 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3bed48d9758f9b62336a701f0447a55a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal\" (UID: \"3bed48d9758f9b62336a701f0447a55a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal"
Apr 17 07:52:13.156405 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.156263 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3bed48d9758f9b62336a701f0447a55a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal\" (UID: \"3bed48d9758f9b62336a701f0447a55a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal"
Apr 17 07:52:13.250647 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:13.250587 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-165.ec2.internal\" not found"
Apr 17 07:52:13.292047 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.292023 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal"
Apr 17 07:52:13.297577 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.297559 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-165.ec2.internal"
Apr 17 07:52:13.351250 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:13.351221 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-165.ec2.internal\" not found"
Apr 17 07:52:13.451729 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:13.451689 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-165.ec2.internal\" not found"
Apr 17 07:52:13.552266 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:13.552189 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-165.ec2.internal\" not found"
Apr 17 07:52:13.652633 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:13.652605 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-165.ec2.internal\" not found"
Apr 17 07:52:13.667029 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.667003 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 07:52:13.667156 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.667139 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 07:52:13.753584 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:13.753398 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-165.ec2.internal\" not found"
Apr 17 07:52:13.753584 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.753437 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 07:52:13.762475 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.762456 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 07:52:13.771507 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.771474 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 07:47:12 +0000 UTC" deadline="2027-10-06 18:49:32.368069784 +0000 UTC"
Apr 17 07:52:13.771583 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.771508 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12898h57m18.596565119s"
Apr 17 07:52:13.787375 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.787350 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9fkw8"
Apr 17 07:52:13.788942 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:13.788906 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a551efa739e7b700ba0ce69f0532dd7.slice/crio-88ae84abead0893b9130e82634c331bb7cefc11e9e2f5eefee4a2514ba2b33f6 WatchSource:0}: Error finding container 88ae84abead0893b9130e82634c331bb7cefc11e9e2f5eefee4a2514ba2b33f6: Status 404 returned error can't find the container with id 88ae84abead0893b9130e82634c331bb7cefc11e9e2f5eefee4a2514ba2b33f6
Apr 17 07:52:13.789364 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:13.789340 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bed48d9758f9b62336a701f0447a55a.slice/crio-b2aa44c600f1d3eae8e804db725e1aeabdcbd6468d5e2715d582476e1c52e6aa WatchSource:0}: Error finding container b2aa44c600f1d3eae8e804db725e1aeabdcbd6468d5e2715d582476e1c52e6aa: Status 404 returned error can't find the container with id b2aa44c600f1d3eae8e804db725e1aeabdcbd6468d5e2715d582476e1c52e6aa
Apr 17 07:52:13.793552 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.793534 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 07:52:13.794782 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.794765 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9fkw8"
Apr 17 07:52:13.853826 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:13.853738 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-165.ec2.internal\" not found"
Apr 17 07:52:13.860720 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.860681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal" event={"ID":"3bed48d9758f9b62336a701f0447a55a","Type":"ContainerStarted","Data":"b2aa44c600f1d3eae8e804db725e1aeabdcbd6468d5e2715d582476e1c52e6aa"}
Apr 17 07:52:13.861588 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.861562 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-165.ec2.internal" event={"ID":"9a551efa739e7b700ba0ce69f0532dd7","Type":"ContainerStarted","Data":"88ae84abead0893b9130e82634c331bb7cefc11e9e2f5eefee4a2514ba2b33f6"}
Apr 17 07:52:13.863509 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.862338 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:52:13.952783 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.952736 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal"
Apr 17 07:52:13.965739 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.965715 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 07:52:13.967205 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.967192 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-165.ec2.internal"
Apr 17 07:52:13.975394 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:13.975378 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 07:52:14.232053 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.231792 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:52:14.307934 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.307892 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:52:14.732988 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.732965 2576 apiserver.go:52] "Watching apiserver"
Apr 17 07:52:14.739700 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.739664 2576 reflector.go:430] "Caches populated" type="*v1.Pod"
reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 07:52:14.740807 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.740779 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-wxvbl","openshift-network-diagnostics/network-check-target-rw8ct","openshift-network-operator/iptables-alerter-fkhld","openshift-ovn-kubernetes/ovnkube-node-h7zs8","kube-system/konnectivity-agent-jxxvl","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67","openshift-dns/node-resolver-xvmfc","openshift-image-registry/node-ca-ssttk","openshift-multus/multus-9lt4x","kube-system/kube-apiserver-proxy-ip-10-0-137-165.ec2.internal","openshift-cluster-node-tuning-operator/tuned-cpkz9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal","openshift-multus/multus-additional-cni-plugins-w4n4s"] Apr 17 07:52:14.742976 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.742956 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67" Apr 17 07:52:14.744384 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.744362 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-w4n4s" Apr 17 07:52:14.745175 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.745155 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 07:52:14.745285 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.745155 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 07:52:14.745285 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.745164 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-bx2wj\"" Apr 17 07:52:14.745389 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.745324 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 07:52:14.745698 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.745682 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wxvbl" Apr 17 07:52:14.745806 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:14.745783 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wxvbl" podUID="0cd46437-1e4d-4927-88fe-3d5f18ee621d" Apr 17 07:52:14.746469 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.746453 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 07:52:14.746570 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.746495 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 07:52:14.746721 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.746703 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 07:52:14.746791 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.746711 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-sj2h5\"" Apr 17 07:52:14.746851 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.746814 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xvmfc" Apr 17 07:52:14.747387 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.747230 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 07:52:14.747387 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.747269 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 07:52:14.748063 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.748027 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-jxxvl" Apr 17 07:52:14.749344 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.749084 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 07:52:14.749474 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.749455 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 07:52:14.749596 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.749563 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4mb8j\"" Apr 17 07:52:14.749682 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.749665 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.750941 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.750560 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 07:52:14.750941 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.750884 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-d2rqr\"" Apr 17 07:52:14.751362 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.751092 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 07:52:14.752221 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.752201 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 07:52:14.752400 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.752378 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-fkhld" Apr 17 07:52:14.752506 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.752487 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 07:52:14.752674 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.752587 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 07:52:14.752921 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.752819 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-xdwvs\"" Apr 17 07:52:14.753058 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.753012 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 07:52:14.753941 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.753775 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 07:52:14.753941 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.753812 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 07:52:14.754097 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.754046 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-ssttk" Apr 17 07:52:14.754540 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.754521 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 07:52:14.754783 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.754754 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 07:52:14.754897 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.754796 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 07:52:14.754961 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.754890 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-m2q46\"" Apr 17 07:52:14.755537 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.755517 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.756060 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.756039 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wqc64\"" Apr 17 07:52:14.756365 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.756344 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 07:52:14.756494 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.756471 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 07:52:14.756561 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.756522 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 07:52:14.757433 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.757414 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 07:52:14.757531 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.757457 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.757531 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.757469 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xjh45\"" Apr 17 07:52:14.758719 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.758703 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rw8ct" Apr 17 07:52:14.758810 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:14.758754 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rw8ct" podUID="b17b42fe-4930-48b1-ac74-5439d9fc893c" Apr 17 07:52:14.759166 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.759150 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 07:52:14.759593 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.759539 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-ft4ks\"" Apr 17 07:52:14.759593 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.759582 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 07:52:14.763063 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763044 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.763063 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763068 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-slash\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.763232 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763085 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5cf29621-68bf-43a5-94a8-643b390fca92-hosts-file\") pod \"node-resolver-xvmfc\" (UID: \"5cf29621-68bf-43a5-94a8-643b390fca92\") " pod="openshift-dns/node-resolver-xvmfc" Apr 17 07:52:14.763232 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763129 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-kubelet\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.763232 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763154 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-var-lib-openvswitch\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.763232 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763177 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-cni-netd\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.763232 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763221 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-systemd-units\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.763468 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763249 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-cni-bin\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.763468 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763348 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d0fdfd60-0abc-4f38-bff8-7936432cb97b-ovnkube-config\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.763468 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763377 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9n8v\" (UniqueName: \"kubernetes.io/projected/d0fdfd60-0abc-4f38-bff8-7936432cb97b-kube-api-access-g9n8v\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.763468 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763398 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhttb\" (UniqueName: \"kubernetes.io/projected/d7cf70ff-ceb3-4797-94a2-b29fbacd8f78-kube-api-access-rhttb\") pod \"iptables-alerter-fkhld\" (UID: \"d7cf70ff-ceb3-4797-94a2-b29fbacd8f78\") " 
pod="openshift-network-operator/iptables-alerter-fkhld" Apr 17 07:52:14.763468 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763412 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28fe626d-8e66-482b-b03e-847bb5829a0b-system-cni-dir\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s" Apr 17 07:52:14.763468 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763447 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z4vt\" (UniqueName: \"kubernetes.io/projected/0cd46437-1e4d-4927-88fe-3d5f18ee621d-kube-api-access-2z4vt\") pod \"network-metrics-daemon-wxvbl\" (UID: \"0cd46437-1e4d-4927-88fe-3d5f18ee621d\") " pod="openshift-multus/network-metrics-daemon-wxvbl" Apr 17 07:52:14.763733 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763471 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b284d533-260c-471d-b97a-e7e5c490b1da-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: \"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67" Apr 17 07:52:14.763733 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763496 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28fe626d-8e66-482b-b03e-847bb5829a0b-cni-binary-copy\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s" Apr 17 07:52:14.763733 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763522 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85497\" (UniqueName: \"kubernetes.io/projected/28fe626d-8e66-482b-b03e-847bb5829a0b-kube-api-access-85497\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s" Apr 17 07:52:14.763733 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763547 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-run-ovn-kubernetes\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.763733 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763570 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d7cf70ff-ceb3-4797-94a2-b29fbacd8f78-host-slash\") pod \"iptables-alerter-fkhld\" (UID: \"d7cf70ff-ceb3-4797-94a2-b29fbacd8f78\") " pod="openshift-network-operator/iptables-alerter-fkhld" Apr 17 07:52:14.763733 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763600 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/28fe626d-8e66-482b-b03e-847bb5829a0b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s" Apr 17 07:52:14.763733 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763625 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7kr4\" (UniqueName: 
\"kubernetes.io/projected/5cf29621-68bf-43a5-94a8-643b390fca92-kube-api-access-r7kr4\") pod \"node-resolver-xvmfc\" (UID: \"5cf29621-68bf-43a5-94a8-643b390fca92\") " pod="openshift-dns/node-resolver-xvmfc" Apr 17 07:52:14.763733 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763647 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-run-netns\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.763733 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763665 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-run-systemd\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.763733 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763719 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-etc-openvswitch\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.764189 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763753 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-run-openvswitch\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.764189 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763799 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d0fdfd60-0abc-4f38-bff8-7936432cb97b-ovnkube-script-lib\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.764189 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763848 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b284d533-260c-471d-b97a-e7e5c490b1da-device-dir\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: \"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67" Apr 17 07:52:14.764189 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763876 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b284d533-260c-471d-b97a-e7e5c490b1da-etc-selinux\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: \"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67" Apr 17 07:52:14.764189 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763899 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b284d533-260c-471d-b97a-e7e5c490b1da-sys-fs\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: \"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67" Apr 17 07:52:14.764189 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763952 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/28fe626d-8e66-482b-b03e-847bb5829a0b-os-release\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: 
\"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s" Apr 17 07:52:14.764189 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.763984 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/502b7429-f580-4079-ad03-6c6b86f1903f-konnectivity-ca\") pod \"konnectivity-agent-jxxvl\" (UID: \"502b7429-f580-4079-ad03-6c6b86f1903f\") " pod="kube-system/konnectivity-agent-jxxvl" Apr 17 07:52:14.764189 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.764025 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-log-socket\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.764189 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.764054 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d0fdfd60-0abc-4f38-bff8-7936432cb97b-ovn-node-metrics-cert\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.764189 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.764082 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b284d533-260c-471d-b97a-e7e5c490b1da-socket-dir\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: \"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67" Apr 17 07:52:14.764189 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.764105 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-node-log\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.764189 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.764130 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d0fdfd60-0abc-4f38-bff8-7936432cb97b-env-overrides\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.764189 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.764153 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk9t6\" (UniqueName: \"kubernetes.io/projected/5a79228b-b4cc-4d96-b9a2-a587214f9a0d-kube-api-access-lk9t6\") pod \"node-ca-ssttk\" (UID: \"5a79228b-b4cc-4d96-b9a2-a587214f9a0d\") " pod="openshift-image-registry/node-ca-ssttk" Apr 17 07:52:14.764189 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.764179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/28fe626d-8e66-482b-b03e-847bb5829a0b-cnibin\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s" Apr 17 07:52:14.764783 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.764201 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs\") pod \"network-metrics-daemon-wxvbl\" (UID: \"0cd46437-1e4d-4927-88fe-3d5f18ee621d\") " pod="openshift-multus/network-metrics-daemon-wxvbl" Apr 17 07:52:14.764783 ip-10-0-137-165 kubenswrapper[2576]: 
I0417 07:52:14.764225 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b284d533-260c-471d-b97a-e7e5c490b1da-registration-dir\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: \"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67" Apr 17 07:52:14.764783 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.764251 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28fe626d-8e66-482b-b03e-847bb5829a0b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s" Apr 17 07:52:14.764783 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.764303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/28fe626d-8e66-482b-b03e-847bb5829a0b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s" Apr 17 07:52:14.764783 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.764328 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d7cf70ff-ceb3-4797-94a2-b29fbacd8f78-iptables-alerter-script\") pod \"iptables-alerter-fkhld\" (UID: \"d7cf70ff-ceb3-4797-94a2-b29fbacd8f78\") " pod="openshift-network-operator/iptables-alerter-fkhld" Apr 17 07:52:14.764783 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.764350 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/5a79228b-b4cc-4d96-b9a2-a587214f9a0d-host\") pod \"node-ca-ssttk\" (UID: \"5a79228b-b4cc-4d96-b9a2-a587214f9a0d\") " pod="openshift-image-registry/node-ca-ssttk" Apr 17 07:52:14.764783 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.764373 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5a79228b-b4cc-4d96-b9a2-a587214f9a0d-serviceca\") pod \"node-ca-ssttk\" (UID: \"5a79228b-b4cc-4d96-b9a2-a587214f9a0d\") " pod="openshift-image-registry/node-ca-ssttk" Apr 17 07:52:14.764783 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.764401 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjtcn\" (UniqueName: \"kubernetes.io/projected/b284d533-260c-471d-b97a-e7e5c490b1da-kube-api-access-pjtcn\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: \"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67" Apr 17 07:52:14.764783 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.764423 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5cf29621-68bf-43a5-94a8-643b390fca92-tmp-dir\") pod \"node-resolver-xvmfc\" (UID: \"5cf29621-68bf-43a5-94a8-643b390fca92\") " pod="openshift-dns/node-resolver-xvmfc" Apr 17 07:52:14.764783 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.764445 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/502b7429-f580-4079-ad03-6c6b86f1903f-agent-certs\") pod \"konnectivity-agent-jxxvl\" (UID: \"502b7429-f580-4079-ad03-6c6b86f1903f\") " pod="kube-system/konnectivity-agent-jxxvl" Apr 17 07:52:14.764783 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.764475 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-run-ovn\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.793587 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.793539 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:52:14.795504 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.795476 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:47:13 +0000 UTC" deadline="2027-10-04 01:14:47.190854774 +0000 UTC" Apr 17 07:52:14.795504 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.795503 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12833h22m32.395355118s" Apr 17 07:52:14.854941 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.854913 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 07:52:14.864990 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.864960 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-kubernetes\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.865145 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.864999 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7f7p\" (UniqueName: \"kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p\") pod \"network-check-target-rw8ct\" (UID: 
\"b17b42fe-4930-48b1-ac74-5439d9fc893c\") " pod="openshift-network-diagnostics/network-check-target-rw8ct" Apr 17 07:52:14.865145 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865030 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d7cf70ff-ceb3-4797-94a2-b29fbacd8f78-iptables-alerter-script\") pod \"iptables-alerter-fkhld\" (UID: \"d7cf70ff-ceb3-4797-94a2-b29fbacd8f78\") " pod="openshift-network-operator/iptables-alerter-fkhld" Apr 17 07:52:14.865145 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7kr4\" (UniqueName: \"kubernetes.io/projected/5cf29621-68bf-43a5-94a8-643b390fca92-kube-api-access-r7kr4\") pod \"node-resolver-xvmfc\" (UID: \"5cf29621-68bf-43a5-94a8-643b390fca92\") " pod="openshift-dns/node-resolver-xvmfc" Apr 17 07:52:14.865145 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865078 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-etc-openvswitch\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.865145 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865098 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d0fdfd60-0abc-4f38-bff8-7936432cb97b-ovnkube-script-lib\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.865145 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865122 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-host-var-lib-cni-bin\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.865145 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865143 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-sysctl-d\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.865473 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865157 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-etc-openvswitch\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.865473 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865181 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/28fe626d-8e66-482b-b03e-847bb5829a0b-os-release\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s" Apr 17 07:52:14.865473 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865297 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-log-socket\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.865473 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865342 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/28fe626d-8e66-482b-b03e-847bb5829a0b-os-release\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s" Apr 17 07:52:14.865473 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-lib-modules\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.865473 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865399 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-log-socket\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.865473 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865450 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-host\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.865760 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b284d533-260c-471d-b97a-e7e5c490b1da-socket-dir\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: \"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67" Apr 17 07:52:14.865760 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865525 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-node-log\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.865760 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865571 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-cnibin\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.865760 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865604 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-host-var-lib-cni-multus\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.865760 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865637 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-node-log\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.865760 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865639 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d7cf70ff-ceb3-4797-94a2-b29fbacd8f78-iptables-alerter-script\") pod \"iptables-alerter-fkhld\" (UID: \"d7cf70ff-ceb3-4797-94a2-b29fbacd8f78\") " pod="openshift-network-operator/iptables-alerter-fkhld" Apr 17 07:52:14.865760 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865679 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d0fdfd60-0abc-4f38-bff8-7936432cb97b-ovnkube-script-lib\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.865760 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865639 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-etc-kubernetes\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.865760 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/28fe626d-8e66-482b-b03e-847bb5829a0b-cnibin\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s" Apr 17 07:52:14.865760 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865721 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b284d533-260c-471d-b97a-e7e5c490b1da-socket-dir\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: \"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67" Apr 17 07:52:14.865760 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865739 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28fe626d-8e66-482b-b03e-847bb5829a0b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s" Apr 17 07:52:14.866229 
ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865773 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/28fe626d-8e66-482b-b03e-847bb5829a0b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s" Apr 17 07:52:14.866229 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865784 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/28fe626d-8e66-482b-b03e-847bb5829a0b-cnibin\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s" Apr 17 07:52:14.866229 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865803 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfz2m\" (UniqueName: \"kubernetes.io/projected/70917b3b-92d9-4406-9795-92a9f4be21ea-kube-api-access-cfz2m\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.866229 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865830 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-tuned\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.866229 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjtcn\" (UniqueName: \"kubernetes.io/projected/b284d533-260c-471d-b97a-e7e5c490b1da-kube-api-access-pjtcn\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: 
\"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67" Apr 17 07:52:14.866229 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865866 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28fe626d-8e66-482b-b03e-847bb5829a0b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s" Apr 17 07:52:14.866229 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.865884 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5cf29621-68bf-43a5-94a8-643b390fca92-tmp-dir\") pod \"node-resolver-xvmfc\" (UID: \"5cf29621-68bf-43a5-94a8-643b390fca92\") " pod="openshift-dns/node-resolver-xvmfc" Apr 17 07:52:14.866229 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-run-ovn\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.866229 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866032 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.866229 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866052 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-system-cni-dir\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.866229 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866080 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-slash\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.866229 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5cf29621-68bf-43a5-94a8-643b390fca92-hosts-file\") pod \"node-resolver-xvmfc\" (UID: \"5cf29621-68bf-43a5-94a8-643b390fca92\") " pod="openshift-dns/node-resolver-xvmfc" Apr 17 07:52:14.866229 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-run-ovn\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.866229 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866117 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.866229 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866131 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-systemd-units\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.866229 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866156 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9n8v\" (UniqueName: \"kubernetes.io/projected/d0fdfd60-0abc-4f38-bff8-7936432cb97b-kube-api-access-g9n8v\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.866229 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866178 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-slash\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.866961 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhttb\" (UniqueName: \"kubernetes.io/projected/d7cf70ff-ceb3-4797-94a2-b29fbacd8f78-kube-api-access-rhttb\") pod \"iptables-alerter-fkhld\" (UID: \"d7cf70ff-ceb3-4797-94a2-b29fbacd8f78\") " pod="openshift-network-operator/iptables-alerter-fkhld" Apr 17 07:52:14.866961 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866184 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5cf29621-68bf-43a5-94a8-643b390fca92-hosts-file\") pod \"node-resolver-xvmfc\" (UID: \"5cf29621-68bf-43a5-94a8-643b390fca92\") " pod="openshift-dns/node-resolver-xvmfc" Apr 17 07:52:14.866961 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866186 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-systemd-units\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.866961 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866229 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-hostroot\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.866961 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b284d533-260c-471d-b97a-e7e5c490b1da-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: \"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67" Apr 17 07:52:14.866961 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28fe626d-8e66-482b-b03e-847bb5829a0b-cni-binary-copy\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s" Apr 17 07:52:14.866961 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866364 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5cf29621-68bf-43a5-94a8-643b390fca92-tmp-dir\") pod \"node-resolver-xvmfc\" (UID: \"5cf29621-68bf-43a5-94a8-643b390fca92\") " pod="openshift-dns/node-resolver-xvmfc" Apr 17 07:52:14.866961 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866390 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b284d533-260c-471d-b97a-e7e5c490b1da-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: \"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67" Apr 17 07:52:14.866961 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866353 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85497\" (UniqueName: \"kubernetes.io/projected/28fe626d-8e66-482b-b03e-847bb5829a0b-kube-api-access-85497\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s" Apr 17 07:52:14.866961 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-run-ovn-kubernetes\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.866961 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866462 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/28fe626d-8e66-482b-b03e-847bb5829a0b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s" Apr 17 07:52:14.866961 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866497 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d7cf70ff-ceb3-4797-94a2-b29fbacd8f78-host-slash\") pod \"iptables-alerter-fkhld\" (UID: \"d7cf70ff-ceb3-4797-94a2-b29fbacd8f78\") " pod="openshift-network-operator/iptables-alerter-fkhld" Apr 17 07:52:14.866961 
ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866526 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-multus-cni-dir\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.866961 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866531 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-run-ovn-kubernetes\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:14.866961 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866557 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-host-run-k8s-cni-cncf-io\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.866961 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866561 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d7cf70ff-ceb3-4797-94a2-b29fbacd8f78-host-slash\") pod \"iptables-alerter-fkhld\" (UID: \"d7cf70ff-ceb3-4797-94a2-b29fbacd8f78\") " pod="openshift-network-operator/iptables-alerter-fkhld" Apr 17 07:52:14.866961 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866584 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70917b3b-92d9-4406-9795-92a9f4be21ea-cni-binary-copy\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " 
pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.867717 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/28fe626d-8e66-482b-b03e-847bb5829a0b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s"
Apr 17 07:52:14.867717 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866650 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-run-netns\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.867717 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-run-systemd\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.867717 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-run-openvswitch\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.867717 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-modprobe-d\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.867717 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866751 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b284d533-260c-471d-b97a-e7e5c490b1da-device-dir\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: \"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67"
Apr 17 07:52:14.867717 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866758 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-run-systemd\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.867717 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866769 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-run-openvswitch\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.867717 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866777 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b284d533-260c-471d-b97a-e7e5c490b1da-etc-selinux\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: \"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67"
Apr 17 07:52:14.867717 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866797 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28fe626d-8e66-482b-b03e-847bb5829a0b-cni-binary-copy\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s"
Apr 17 07:52:14.867717 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866825 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b284d533-260c-471d-b97a-e7e5c490b1da-sys-fs\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: \"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67"
Apr 17 07:52:14.867717 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866831 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b284d533-260c-471d-b97a-e7e5c490b1da-device-dir\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: \"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67"
Apr 17 07:52:14.867717 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866777 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-run-netns\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.867717 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866846 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b284d533-260c-471d-b97a-e7e5c490b1da-etc-selinux\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: \"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67"
Apr 17 07:52:14.867717 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866879 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/502b7429-f580-4079-ad03-6c6b86f1903f-konnectivity-ca\") pod \"konnectivity-agent-jxxvl\" (UID: \"502b7429-f580-4079-ad03-6c6b86f1903f\") " pod="kube-system/konnectivity-agent-jxxvl"
Apr 17 07:52:14.867717 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866896 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d0fdfd60-0abc-4f38-bff8-7936432cb97b-ovn-node-metrics-cert\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.867717 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.866932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-multus-socket-dir-parent\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.868455 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867024 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-host-var-lib-kubelet\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.868455 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867040 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-sys\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.868455 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867051 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b284d533-260c-471d-b97a-e7e5c490b1da-sys-fs\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: \"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67"
Apr 17 07:52:14.868455 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867079 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d0fdfd60-0abc-4f38-bff8-7936432cb97b-env-overrides\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.868455 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867100 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lk9t6\" (UniqueName: \"kubernetes.io/projected/5a79228b-b4cc-4d96-b9a2-a587214f9a0d-kube-api-access-lk9t6\") pod \"node-ca-ssttk\" (UID: \"5a79228b-b4cc-4d96-b9a2-a587214f9a0d\") " pod="openshift-image-registry/node-ca-ssttk"
Apr 17 07:52:14.868455 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867114 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-run\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.868455 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867132 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-var-lib-kubelet\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.868455 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867147 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sck5f\" (UniqueName: \"kubernetes.io/projected/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-kube-api-access-sck5f\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.868455 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs\") pod \"network-metrics-daemon-wxvbl\" (UID: \"0cd46437-1e4d-4927-88fe-3d5f18ee621d\") " pod="openshift-multus/network-metrics-daemon-wxvbl"
Apr 17 07:52:14.868455 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867197 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b284d533-260c-471d-b97a-e7e5c490b1da-registration-dir\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: \"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67"
Apr 17 07:52:14.868455 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867218 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a79228b-b4cc-4d96-b9a2-a587214f9a0d-host\") pod \"node-ca-ssttk\" (UID: \"5a79228b-b4cc-4d96-b9a2-a587214f9a0d\") " pod="openshift-image-registry/node-ca-ssttk"
Apr 17 07:52:14.868455 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867240 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5a79228b-b4cc-4d96-b9a2-a587214f9a0d-serviceca\") pod \"node-ca-ssttk\" (UID: \"5a79228b-b4cc-4d96-b9a2-a587214f9a0d\") " pod="openshift-image-registry/node-ca-ssttk"
Apr 17 07:52:14.868455 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:14.867328 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:52:14.868455 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867353 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b284d533-260c-471d-b97a-e7e5c490b1da-registration-dir\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: \"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67"
Apr 17 07:52:14.868455 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867379 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a79228b-b4cc-4d96-b9a2-a587214f9a0d-host\") pod \"node-ca-ssttk\" (UID: \"5a79228b-b4cc-4d96-b9a2-a587214f9a0d\") " pod="openshift-image-registry/node-ca-ssttk"
Apr 17 07:52:14.868455 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867382 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/502b7429-f580-4079-ad03-6c6b86f1903f-konnectivity-ca\") pod \"konnectivity-agent-jxxvl\" (UID: \"502b7429-f580-4079-ad03-6c6b86f1903f\") " pod="kube-system/konnectivity-agent-jxxvl"
Apr 17 07:52:14.868455 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867446 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d0fdfd60-0abc-4f38-bff8-7936432cb97b-env-overrides\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.869112 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:14.867450 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs podName:0cd46437-1e4d-4927-88fe-3d5f18ee621d nodeName:}" failed. No retries permitted until 2026-04-17 07:52:15.367401624 +0000 UTC m=+3.024264835 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs") pod "network-metrics-daemon-wxvbl" (UID: "0cd46437-1e4d-4927-88fe-3d5f18ee621d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:52:14.869112 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867506 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-os-release\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.869112 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867541 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/28fe626d-8e66-482b-b03e-847bb5829a0b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s"
Apr 17 07:52:14.869112 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867533 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-multus-conf-dir\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.869112 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867614 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-sysconfig\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.869112 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5a79228b-b4cc-4d96-b9a2-a587214f9a0d-serviceca\") pod \"node-ca-ssttk\" (UID: \"5a79228b-b4cc-4d96-b9a2-a587214f9a0d\") " pod="openshift-image-registry/node-ca-ssttk"
Apr 17 07:52:14.869112 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867644 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/502b7429-f580-4079-ad03-6c6b86f1903f-agent-certs\") pod \"konnectivity-agent-jxxvl\" (UID: \"502b7429-f580-4079-ad03-6c6b86f1903f\") " pod="kube-system/konnectivity-agent-jxxvl"
Apr 17 07:52:14.869112 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867687 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-tmp\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.869112 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867711 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-kubelet\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.869112 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867728 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-var-lib-openvswitch\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.869112 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-cni-netd\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.869112 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867740 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 07:52:14.869112 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867781 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/70917b3b-92d9-4406-9795-92a9f4be21ea-multus-daemon-config\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.869112 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867793 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-var-lib-openvswitch\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.869112 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867806 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-sysctl-conf\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.869112 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867781 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-kubelet\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.869112 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-cni-bin\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.869625 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867869 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-cni-netd\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.869625 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867889 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d0fdfd60-0abc-4f38-bff8-7936432cb97b-ovnkube-config\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.869625 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867926 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-host-run-multus-certs\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.869625 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867938 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d0fdfd60-0abc-4f38-bff8-7936432cb97b-host-cni-bin\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.869625 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867954 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-systemd\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.869625 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.867981 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28fe626d-8e66-482b-b03e-847bb5829a0b-system-cni-dir\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s"
Apr 17 07:52:14.869625 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.868006 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z4vt\" (UniqueName: \"kubernetes.io/projected/0cd46437-1e4d-4927-88fe-3d5f18ee621d-kube-api-access-2z4vt\") pod \"network-metrics-daemon-wxvbl\" (UID: \"0cd46437-1e4d-4927-88fe-3d5f18ee621d\") " pod="openshift-multus/network-metrics-daemon-wxvbl"
Apr 17 07:52:14.869625 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.868030 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-host-run-netns\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.869625 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.868116 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28fe626d-8e66-482b-b03e-847bb5829a0b-system-cni-dir\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s"
Apr 17 07:52:14.869625 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.868853 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d0fdfd60-0abc-4f38-bff8-7936432cb97b-ovnkube-config\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.871672 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.871648 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d0fdfd60-0abc-4f38-bff8-7936432cb97b-ovn-node-metrics-cert\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.871790 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.871768 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/502b7429-f580-4079-ad03-6c6b86f1903f-agent-certs\") pod \"konnectivity-agent-jxxvl\" (UID: \"502b7429-f580-4079-ad03-6c6b86f1903f\") " pod="kube-system/konnectivity-agent-jxxvl"
Apr 17 07:52:14.873521 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.873499 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjtcn\" (UniqueName: \"kubernetes.io/projected/b284d533-260c-471d-b97a-e7e5c490b1da-kube-api-access-pjtcn\") pod \"aws-ebs-csi-driver-node-wls67\" (UID: \"b284d533-260c-471d-b97a-e7e5c490b1da\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67"
Apr 17 07:52:14.873881 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.873859 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7kr4\" (UniqueName: \"kubernetes.io/projected/5cf29621-68bf-43a5-94a8-643b390fca92-kube-api-access-r7kr4\") pod \"node-resolver-xvmfc\" (UID: \"5cf29621-68bf-43a5-94a8-643b390fca92\") " pod="openshift-dns/node-resolver-xvmfc"
Apr 17 07:52:14.873951 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.873885 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85497\" (UniqueName: \"kubernetes.io/projected/28fe626d-8e66-482b-b03e-847bb5829a0b-kube-api-access-85497\") pod \"multus-additional-cni-plugins-w4n4s\" (UID: \"28fe626d-8e66-482b-b03e-847bb5829a0b\") " pod="openshift-multus/multus-additional-cni-plugins-w4n4s"
Apr 17 07:52:14.874852 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.874832 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9n8v\" (UniqueName: \"kubernetes.io/projected/d0fdfd60-0abc-4f38-bff8-7936432cb97b-kube-api-access-g9n8v\") pod \"ovnkube-node-h7zs8\" (UID: \"d0fdfd60-0abc-4f38-bff8-7936432cb97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:14.882146 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.882119 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhttb\" (UniqueName: \"kubernetes.io/projected/d7cf70ff-ceb3-4797-94a2-b29fbacd8f78-kube-api-access-rhttb\") pod \"iptables-alerter-fkhld\" (UID: \"d7cf70ff-ceb3-4797-94a2-b29fbacd8f78\") " pod="openshift-network-operator/iptables-alerter-fkhld"
Apr 17 07:52:14.882295 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.882263 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk9t6\" (UniqueName: \"kubernetes.io/projected/5a79228b-b4cc-4d96-b9a2-a587214f9a0d-kube-api-access-lk9t6\") pod \"node-ca-ssttk\" (UID: \"5a79228b-b4cc-4d96-b9a2-a587214f9a0d\") " pod="openshift-image-registry/node-ca-ssttk"
Apr 17 07:52:14.882662 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.882646 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z4vt\" (UniqueName: \"kubernetes.io/projected/0cd46437-1e4d-4927-88fe-3d5f18ee621d-kube-api-access-2z4vt\") pod \"network-metrics-daemon-wxvbl\" (UID: \"0cd46437-1e4d-4927-88fe-3d5f18ee621d\") " pod="openshift-multus/network-metrics-daemon-wxvbl"
Apr 17 07:52:14.969451 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.969416 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-lib-modules\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.969630 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.969460 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-host\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.969630 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.969488 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-cnibin\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.969630 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.969514 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-host-var-lib-cni-multus\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.969630 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.969539 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-etc-kubernetes\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.969630 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.969574 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-host\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.969630 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.969595 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-etc-kubernetes\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.969630 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.969593 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-lib-modules\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.969931 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.969715 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-host-var-lib-cni-multus\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.969984 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.969614 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-cnibin\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.970159 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.969621 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfz2m\" (UniqueName: \"kubernetes.io/projected/70917b3b-92d9-4406-9795-92a9f4be21ea-kube-api-access-cfz2m\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.970237 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.970196 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-tuned\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.970307 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.970258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-system-cni-dir\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.970365 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.970322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-hostroot\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.970417 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.970368 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-multus-cni-dir\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.970417 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.970396 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-system-cni-dir\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.970537 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.970519 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-host-run-k8s-cni-cncf-io\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.970611 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.970538 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-hostroot\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.970611 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.970603 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-host-run-k8s-cni-cncf-io\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.970713 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.970698 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-multus-cni-dir\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.970756 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.970711 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70917b3b-92d9-4406-9795-92a9f4be21ea-cni-binary-copy\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.970801 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.970757 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-modprobe-d\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.970852 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.970801 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-multus-socket-dir-parent\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.970852 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.970840 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-host-var-lib-kubelet\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.970936 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.970873 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-sys\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.970936 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.970905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-run\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.971022 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.970938 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-var-lib-kubelet\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.971022 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.970973 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sck5f\" (UniqueName: \"kubernetes.io/projected/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-kube-api-access-sck5f\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.971108 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971022 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-os-release\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.971108 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-multus-conf-dir\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.971108 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-sysconfig\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.971238 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-tmp\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9"
Apr 17 07:52:14.971238 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971156 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/70917b3b-92d9-4406-9795-92a9f4be21ea-multus-daemon-config\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:14.971238 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971194 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName:
\"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-sysctl-conf\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.971401 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70917b3b-92d9-4406-9795-92a9f4be21ea-cni-binary-copy\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.971401 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971293 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-host-run-multus-certs\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.971401 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-multus-conf-dir\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.971401 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971366 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-host-var-lib-kubelet\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.971581 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971459 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-os-release\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.971581 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971479 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-sysconfig\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.971669 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971589 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-run\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.971718 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-multus-socket-dir-parent\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.971718 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971711 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-modprobe-d\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.971806 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971732 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-host-run-multus-certs\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.971851 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-systemd\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.971901 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971871 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-host-run-netns\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.971901 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971872 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-sys\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.971901 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971884 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-sysctl-conf\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.972037 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-kubernetes\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.972037 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971946 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7f7p\" (UniqueName: \"kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p\") pod \"network-check-target-rw8ct\" (UID: \"b17b42fe-4930-48b1-ac74-5439d9fc893c\") " pod="openshift-network-diagnostics/network-check-target-rw8ct" Apr 17 07:52:14.972037 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971981 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-systemd\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.972037 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971987 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-host-var-lib-cni-bin\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.972037 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.971996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-kubernetes\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.972037 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.972018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-sysctl-d\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.972037 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.972025 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-host-run-netns\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.972438 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.972063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/70917b3b-92d9-4406-9795-92a9f4be21ea-host-var-lib-cni-bin\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.972438 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.972174 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-sysctl-d\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.972438 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.972242 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-var-lib-kubelet\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.972566 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.972496 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/70917b3b-92d9-4406-9795-92a9f4be21ea-multus-daemon-config\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.974525 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.974499 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-etc-tuned\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.975586 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.975567 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-tmp\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:14.976982 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:14.976959 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:52:14.976982 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:14.976985 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:52:14.977129 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:14.976999 2576 projected.go:194] Error preparing data for projected volume kube-api-access-f7f7p for pod openshift-network-diagnostics/network-check-target-rw8ct: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:14.977129 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:14.977068 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p podName:b17b42fe-4930-48b1-ac74-5439d9fc893c nodeName:}" failed. No retries permitted until 2026-04-17 07:52:15.477052279 +0000 UTC m=+3.133915486 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-f7f7p" (UniqueName: "kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p") pod "network-check-target-rw8ct" (UID: "b17b42fe-4930-48b1-ac74-5439d9fc893c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:14.978542 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.978520 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfz2m\" (UniqueName: \"kubernetes.io/projected/70917b3b-92d9-4406-9795-92a9f4be21ea-kube-api-access-cfz2m\") pod \"multus-9lt4x\" (UID: \"70917b3b-92d9-4406-9795-92a9f4be21ea\") " pod="openshift-multus/multus-9lt4x" Apr 17 07:52:14.978860 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:14.978834 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sck5f\" (UniqueName: \"kubernetes.io/projected/e9a8b24f-6854-4d01-95e6-4bc5d1edd592-kube-api-access-sck5f\") pod \"tuned-cpkz9\" (UID: \"e9a8b24f-6854-4d01-95e6-4bc5d1edd592\") " pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:15.056568 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.056484 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67" Apr 17 07:52:15.063189 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.063171 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-w4n4s"
Apr 17 07:52:15.072396 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.072368 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xvmfc"
Apr 17 07:52:15.079013 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.078994 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jxxvl"
Apr 17 07:52:15.084600 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.084581 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:52:15.091136 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.091115 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fkhld"
Apr 17 07:52:15.097638 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.097619 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ssttk"
Apr 17 07:52:15.104185 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.104168 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9lt4x"
Apr 17 07:52:15.110788 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.110760 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" Apr 17 07:52:15.375503 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.375424 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs\") pod \"network-metrics-daemon-wxvbl\" (UID: \"0cd46437-1e4d-4927-88fe-3d5f18ee621d\") " pod="openshift-multus/network-metrics-daemon-wxvbl" Apr 17 07:52:15.375633 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:15.375539 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:15.375633 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:15.375591 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs podName:0cd46437-1e4d-4927-88fe-3d5f18ee621d nodeName:}" failed. No retries permitted until 2026-04-17 07:52:16.375577897 +0000 UTC m=+4.032441088 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs") pod "network-metrics-daemon-wxvbl" (UID: "0cd46437-1e4d-4927-88fe-3d5f18ee621d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:15.414134 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:15.414110 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7cf70ff_ceb3_4797_94a2_b29fbacd8f78.slice/crio-49c3109dc781a55a00cd99419c469bce8edb355036e01d9cf4b16b5b758dcaa5 WatchSource:0}: Error finding container 49c3109dc781a55a00cd99419c469bce8edb355036e01d9cf4b16b5b758dcaa5: Status 404 returned error can't find the container with id 49c3109dc781a55a00cd99419c469bce8edb355036e01d9cf4b16b5b758dcaa5 Apr 17 07:52:15.415886 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:15.415853 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28fe626d_8e66_482b_b03e_847bb5829a0b.slice/crio-3659e1db78afe95d1ee073d909e00e7f4697db3770067e644de61e57ed9e4155 WatchSource:0}: Error finding container 3659e1db78afe95d1ee073d909e00e7f4697db3770067e644de61e57ed9e4155: Status 404 returned error can't find the container with id 3659e1db78afe95d1ee073d909e00e7f4697db3770067e644de61e57ed9e4155 Apr 17 07:52:15.417121 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:15.417095 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0fdfd60_0abc_4f38_bff8_7936432cb97b.slice/crio-ecd3ecae7c3ca519f89d64a93c8d9b75674374e27c3d48ff842da45ebe547dc0 WatchSource:0}: Error finding container ecd3ecae7c3ca519f89d64a93c8d9b75674374e27c3d48ff842da45ebe547dc0: Status 404 returned error can't find the container with id ecd3ecae7c3ca519f89d64a93c8d9b75674374e27c3d48ff842da45ebe547dc0 Apr 17 07:52:15.422463 
ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:15.422441 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cf29621_68bf_43a5_94a8_643b390fca92.slice/crio-f9aeee823e8386413452447a527abcf172061531fa697bd3f391c09a83c2bd57 WatchSource:0}: Error finding container f9aeee823e8386413452447a527abcf172061531fa697bd3f391c09a83c2bd57: Status 404 returned error can't find the container with id f9aeee823e8386413452447a527abcf172061531fa697bd3f391c09a83c2bd57 Apr 17 07:52:15.425493 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:15.424648 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a79228b_b4cc_4d96_b9a2_a587214f9a0d.slice/crio-6392511ba73001d3ad67691b0d7833a1c5ce0439e3bcfd6e41c26176d7f87ad6 WatchSource:0}: Error finding container 6392511ba73001d3ad67691b0d7833a1c5ce0439e3bcfd6e41c26176d7f87ad6: Status 404 returned error can't find the container with id 6392511ba73001d3ad67691b0d7833a1c5ce0439e3bcfd6e41c26176d7f87ad6 Apr 17 07:52:15.425493 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:15.425191 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod502b7429_f580_4079_ad03_6c6b86f1903f.slice/crio-befeab6d9b24eb09ce029722121c23da78d43e87c8bcf037127fdf4d88407a3e WatchSource:0}: Error finding container befeab6d9b24eb09ce029722121c23da78d43e87c8bcf037127fdf4d88407a3e: Status 404 returned error can't find the container with id befeab6d9b24eb09ce029722121c23da78d43e87c8bcf037127fdf4d88407a3e Apr 17 07:52:15.427088 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:15.427054 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70917b3b_92d9_4406_9795_92a9f4be21ea.slice/crio-40df8e4ea158d89594116c16622b50f11bba5196566c5d0184a5f0c04cd3d5dc WatchSource:0}: 
Error finding container 40df8e4ea158d89594116c16622b50f11bba5196566c5d0184a5f0c04cd3d5dc: Status 404 returned error can't find the container with id 40df8e4ea158d89594116c16622b50f11bba5196566c5d0184a5f0c04cd3d5dc Apr 17 07:52:15.428106 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:15.428074 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9a8b24f_6854_4d01_95e6_4bc5d1edd592.slice/crio-3df079ad03df8cd96e16c16cb8112f061b598036cc746b0969110d17b2f72b2a WatchSource:0}: Error finding container 3df079ad03df8cd96e16c16cb8112f061b598036cc746b0969110d17b2f72b2a: Status 404 returned error can't find the container with id 3df079ad03df8cd96e16c16cb8112f061b598036cc746b0969110d17b2f72b2a Apr 17 07:52:15.576139 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.576112 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7f7p\" (UniqueName: \"kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p\") pod \"network-check-target-rw8ct\" (UID: \"b17b42fe-4930-48b1-ac74-5439d9fc893c\") " pod="openshift-network-diagnostics/network-check-target-rw8ct" Apr 17 07:52:15.576289 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:15.576252 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:52:15.576289 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:15.576288 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:52:15.576408 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:15.576301 2576 projected.go:194] Error preparing data for projected volume kube-api-access-f7f7p for pod openshift-network-diagnostics/network-check-target-rw8ct: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:15.576408 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:15.576370 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p podName:b17b42fe-4930-48b1-ac74-5439d9fc893c nodeName:}" failed. No retries permitted until 2026-04-17 07:52:16.576349073 +0000 UTC m=+4.233212266 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-f7f7p" (UniqueName: "kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p") pod "network-check-target-rw8ct" (UID: "b17b42fe-4930-48b1-ac74-5439d9fc893c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:15.796630 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.796516 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:47:13 +0000 UTC" deadline="2027-09-17 00:08:06.738094347 +0000 UTC" Apr 17 07:52:15.796630 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.796553 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12424h15m50.941544913s" Apr 17 07:52:15.883552 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.883047 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-165.ec2.internal" event={"ID":"9a551efa739e7b700ba0ce69f0532dd7","Type":"ContainerStarted","Data":"94eb9801861a33a604310c0ba88af8d51c4b0aa6ce1d22f2f67ee11d1b765299"} Apr 17 07:52:15.896067 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.896026 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/konnectivity-agent-jxxvl" event={"ID":"502b7429-f580-4079-ad03-6c6b86f1903f","Type":"ContainerStarted","Data":"befeab6d9b24eb09ce029722121c23da78d43e87c8bcf037127fdf4d88407a3e"} Apr 17 07:52:15.902824 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.902763 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67" event={"ID":"b284d533-260c-471d-b97a-e7e5c490b1da","Type":"ContainerStarted","Data":"6dc89e5075c8b8b454a1eb3dabfb3d77d03b4768ba0a0f230e6e2ec8538ec6d2"} Apr 17 07:52:15.912725 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.912694 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xvmfc" event={"ID":"5cf29621-68bf-43a5-94a8-643b390fca92","Type":"ContainerStarted","Data":"f9aeee823e8386413452447a527abcf172061531fa697bd3f391c09a83c2bd57"} Apr 17 07:52:15.920230 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.920146 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" event={"ID":"d0fdfd60-0abc-4f38-bff8-7936432cb97b","Type":"ContainerStarted","Data":"ecd3ecae7c3ca519f89d64a93c8d9b75674374e27c3d48ff842da45ebe547dc0"} Apr 17 07:52:15.923767 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.923697 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" event={"ID":"e9a8b24f-6854-4d01-95e6-4bc5d1edd592","Type":"ContainerStarted","Data":"3df079ad03df8cd96e16c16cb8112f061b598036cc746b0969110d17b2f72b2a"} Apr 17 07:52:15.941389 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.939004 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9lt4x" event={"ID":"70917b3b-92d9-4406-9795-92a9f4be21ea","Type":"ContainerStarted","Data":"40df8e4ea158d89594116c16622b50f11bba5196566c5d0184a5f0c04cd3d5dc"} Apr 17 07:52:15.952876 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.952841 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ssttk" event={"ID":"5a79228b-b4cc-4d96-b9a2-a587214f9a0d","Type":"ContainerStarted","Data":"6392511ba73001d3ad67691b0d7833a1c5ce0439e3bcfd6e41c26176d7f87ad6"}
Apr 17 07:52:15.958127 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.958096 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4n4s" event={"ID":"28fe626d-8e66-482b-b03e-847bb5829a0b","Type":"ContainerStarted","Data":"3659e1db78afe95d1ee073d909e00e7f4697db3770067e644de61e57ed9e4155"}
Apr 17 07:52:15.968111 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:15.968079 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fkhld" event={"ID":"d7cf70ff-ceb3-4797-94a2-b29fbacd8f78","Type":"ContainerStarted","Data":"49c3109dc781a55a00cd99419c469bce8edb355036e01d9cf4b16b5b758dcaa5"}
Apr 17 07:52:16.386019 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:16.385268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs\") pod \"network-metrics-daemon-wxvbl\" (UID: \"0cd46437-1e4d-4927-88fe-3d5f18ee621d\") " pod="openshift-multus/network-metrics-daemon-wxvbl"
Apr 17 07:52:16.386019 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:16.385503 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:52:16.386019 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:16.385574 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs podName:0cd46437-1e4d-4927-88fe-3d5f18ee621d nodeName:}" failed. No retries permitted until 2026-04-17 07:52:18.38555457 +0000 UTC m=+6.042417764 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs") pod "network-metrics-daemon-wxvbl" (UID: "0cd46437-1e4d-4927-88fe-3d5f18ee621d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:52:16.587689 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:16.587016 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7f7p\" (UniqueName: \"kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p\") pod \"network-check-target-rw8ct\" (UID: \"b17b42fe-4930-48b1-ac74-5439d9fc893c\") " pod="openshift-network-diagnostics/network-check-target-rw8ct"
Apr 17 07:52:16.587689 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:16.587200 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:52:16.587689 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:16.587219 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:52:16.587689 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:16.587232 2576 projected.go:194] Error preparing data for projected volume kube-api-access-f7f7p for pod openshift-network-diagnostics/network-check-target-rw8ct: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:52:16.587689 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:16.587303 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p podName:b17b42fe-4930-48b1-ac74-5439d9fc893c nodeName:}" failed. No retries permitted until 2026-04-17 07:52:18.587270213 +0000 UTC m=+6.244133409 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-f7f7p" (UniqueName: "kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p") pod "network-check-target-rw8ct" (UID: "b17b42fe-4930-48b1-ac74-5439d9fc893c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:52:16.869701 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:16.867447 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rw8ct"
Apr 17 07:52:16.869701 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:16.867447 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wxvbl"
Apr 17 07:52:16.869701 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:16.867603 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rw8ct" podUID="b17b42fe-4930-48b1-ac74-5439d9fc893c"
Apr 17 07:52:16.869701 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:16.867706 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wxvbl" podUID="0cd46437-1e4d-4927-88fe-3d5f18ee621d"
Apr 17 07:52:17.002929 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:17.002886 2576 generic.go:358] "Generic (PLEG): container finished" podID="3bed48d9758f9b62336a701f0447a55a" containerID="0814e4da86d43feb140d695d41d708ce6b287edc5c4e757ef7774644d5320d6a" exitCode=0
Apr 17 07:52:17.003932 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:17.003897 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal" event={"ID":"3bed48d9758f9b62336a701f0447a55a","Type":"ContainerDied","Data":"0814e4da86d43feb140d695d41d708ce6b287edc5c4e757ef7774644d5320d6a"}
Apr 17 07:52:17.018577 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:17.018131 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-165.ec2.internal" podStartSLOduration=4.018113579 podStartE2EDuration="4.018113579s" podCreationTimestamp="2026-04-17 07:52:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:52:15.899326678 +0000 UTC m=+3.556189892" watchObservedRunningTime="2026-04-17 07:52:17.018113579 +0000 UTC m=+4.674976794"
Apr 17 07:52:18.012975 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:18.012051 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal" event={"ID":"3bed48d9758f9b62336a701f0447a55a","Type":"ContainerStarted","Data":"a08ea3ebe1973d9784d504185bf5f9e0bc6f0a37c738d39a15e7f03c74ca8ae7"}
Apr 17 07:52:18.030360 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:18.030158 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-165.ec2.internal" podStartSLOduration=5.030138676 podStartE2EDuration="5.030138676s" podCreationTimestamp="2026-04-17 07:52:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:52:18.029042385 +0000 UTC m=+5.685905600" watchObservedRunningTime="2026-04-17 07:52:18.030138676 +0000 UTC m=+5.687001891"
Apr 17 07:52:18.404392 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:18.404356 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs\") pod \"network-metrics-daemon-wxvbl\" (UID: \"0cd46437-1e4d-4927-88fe-3d5f18ee621d\") " pod="openshift-multus/network-metrics-daemon-wxvbl"
Apr 17 07:52:18.404576 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:18.404555 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:52:18.404671 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:18.404659 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs podName:0cd46437-1e4d-4927-88fe-3d5f18ee621d nodeName:}" failed. No retries permitted until 2026-04-17 07:52:22.404638118 +0000 UTC m=+10.061501315 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs") pod "network-metrics-daemon-wxvbl" (UID: "0cd46437-1e4d-4927-88fe-3d5f18ee621d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:52:18.606400 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:18.606364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7f7p\" (UniqueName: \"kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p\") pod \"network-check-target-rw8ct\" (UID: \"b17b42fe-4930-48b1-ac74-5439d9fc893c\") " pod="openshift-network-diagnostics/network-check-target-rw8ct"
Apr 17 07:52:18.606553 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:18.606523 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:52:18.606553 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:18.606543 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:52:18.606642 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:18.606557 2576 projected.go:194] Error preparing data for projected volume kube-api-access-f7f7p for pod openshift-network-diagnostics/network-check-target-rw8ct: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:52:18.606642 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:18.606615 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p podName:b17b42fe-4930-48b1-ac74-5439d9fc893c nodeName:}" failed. No retries permitted until 2026-04-17 07:52:22.606596783 +0000 UTC m=+10.263459977 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-f7f7p" (UniqueName: "kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p") pod "network-check-target-rw8ct" (UID: "b17b42fe-4930-48b1-ac74-5439d9fc893c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:52:18.858936 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:18.858430 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wxvbl"
Apr 17 07:52:18.858936 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:18.858576 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wxvbl" podUID="0cd46437-1e4d-4927-88fe-3d5f18ee621d"
Apr 17 07:52:18.858936 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:18.858624 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rw8ct"
Apr 17 07:52:18.858936 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:18.858723 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rw8ct" podUID="b17b42fe-4930-48b1-ac74-5439d9fc893c"
Apr 17 07:52:20.858166 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:20.858134 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wxvbl"
Apr 17 07:52:20.858661 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:20.858287 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wxvbl" podUID="0cd46437-1e4d-4927-88fe-3d5f18ee621d"
Apr 17 07:52:20.858661 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:20.858644 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rw8ct"
Apr 17 07:52:20.858774 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:20.858742 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rw8ct" podUID="b17b42fe-4930-48b1-ac74-5439d9fc893c"
Apr 17 07:52:20.988331 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:20.988292 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-pmpqp"]
Apr 17 07:52:20.992081 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:20.991652 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pmpqp"
Apr 17 07:52:20.992081 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:20.991728 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pmpqp" podUID="85a06191-13b3-4742-8598-8d7237fae7f3"
Apr 17 07:52:21.026885 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:21.026706 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/85a06191-13b3-4742-8598-8d7237fae7f3-kubelet-config\") pod \"global-pull-secret-syncer-pmpqp\" (UID: \"85a06191-13b3-4742-8598-8d7237fae7f3\") " pod="kube-system/global-pull-secret-syncer-pmpqp"
Apr 17 07:52:21.026885 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:21.026778 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret\") pod \"global-pull-secret-syncer-pmpqp\" (UID: \"85a06191-13b3-4742-8598-8d7237fae7f3\") " pod="kube-system/global-pull-secret-syncer-pmpqp"
Apr 17 07:52:21.026885 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:21.026839 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/85a06191-13b3-4742-8598-8d7237fae7f3-dbus\") pod \"global-pull-secret-syncer-pmpqp\" (UID: \"85a06191-13b3-4742-8598-8d7237fae7f3\") " pod="kube-system/global-pull-secret-syncer-pmpqp"
Apr 17 07:52:21.128268 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:21.128189 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret\") pod \"global-pull-secret-syncer-pmpqp\" (UID: \"85a06191-13b3-4742-8598-8d7237fae7f3\") " pod="kube-system/global-pull-secret-syncer-pmpqp"
Apr 17 07:52:21.128268 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:21.128247 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/85a06191-13b3-4742-8598-8d7237fae7f3-dbus\") pod \"global-pull-secret-syncer-pmpqp\" (UID: \"85a06191-13b3-4742-8598-8d7237fae7f3\") " pod="kube-system/global-pull-secret-syncer-pmpqp"
Apr 17 07:52:21.128471 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:21.128369 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 07:52:21.128471 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:21.128405 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/85a06191-13b3-4742-8598-8d7237fae7f3-dbus\") pod \"global-pull-secret-syncer-pmpqp\" (UID: \"85a06191-13b3-4742-8598-8d7237fae7f3\") " pod="kube-system/global-pull-secret-syncer-pmpqp"
Apr 17 07:52:21.128471 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:21.128411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/85a06191-13b3-4742-8598-8d7237fae7f3-kubelet-config\") pod \"global-pull-secret-syncer-pmpqp\" (UID: \"85a06191-13b3-4742-8598-8d7237fae7f3\") " pod="kube-system/global-pull-secret-syncer-pmpqp"
Apr 17 07:52:21.128471 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:21.128439 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret podName:85a06191-13b3-4742-8598-8d7237fae7f3 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:21.628419498 +0000 UTC m=+9.285282696 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret") pod "global-pull-secret-syncer-pmpqp" (UID: "85a06191-13b3-4742-8598-8d7237fae7f3") : object "kube-system"/"original-pull-secret" not registered
Apr 17 07:52:21.128649 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:21.128478 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/85a06191-13b3-4742-8598-8d7237fae7f3-kubelet-config\") pod \"global-pull-secret-syncer-pmpqp\" (UID: \"85a06191-13b3-4742-8598-8d7237fae7f3\") " pod="kube-system/global-pull-secret-syncer-pmpqp"
Apr 17 07:52:21.632577 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:21.632496 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret\") pod \"global-pull-secret-syncer-pmpqp\" (UID: \"85a06191-13b3-4742-8598-8d7237fae7f3\") " pod="kube-system/global-pull-secret-syncer-pmpqp"
Apr 17 07:52:21.632762 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:21.632624 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 07:52:21.632762 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:21.632683 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret podName:85a06191-13b3-4742-8598-8d7237fae7f3 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:22.63266694 +0000 UTC m=+10.289530139 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret") pod "global-pull-secret-syncer-pmpqp" (UID: "85a06191-13b3-4742-8598-8d7237fae7f3") : object "kube-system"/"original-pull-secret" not registered
Apr 17 07:52:22.438889 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:22.438845 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs\") pod \"network-metrics-daemon-wxvbl\" (UID: \"0cd46437-1e4d-4927-88fe-3d5f18ee621d\") " pod="openshift-multus/network-metrics-daemon-wxvbl"
Apr 17 07:52:22.439355 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:22.439031 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:52:22.439355 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:22.439095 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs podName:0cd46437-1e4d-4927-88fe-3d5f18ee621d nodeName:}" failed. No retries permitted until 2026-04-17 07:52:30.439076977 +0000 UTC m=+18.095940173 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs") pod "network-metrics-daemon-wxvbl" (UID: "0cd46437-1e4d-4927-88fe-3d5f18ee621d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:52:22.640358 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:22.640315 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret\") pod \"global-pull-secret-syncer-pmpqp\" (UID: \"85a06191-13b3-4742-8598-8d7237fae7f3\") " pod="kube-system/global-pull-secret-syncer-pmpqp"
Apr 17 07:52:22.640533 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:22.640381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7f7p\" (UniqueName: \"kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p\") pod \"network-check-target-rw8ct\" (UID: \"b17b42fe-4930-48b1-ac74-5439d9fc893c\") " pod="openshift-network-diagnostics/network-check-target-rw8ct"
Apr 17 07:52:22.640607 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:22.640567 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:52:22.640607 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:22.640585 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:52:22.640607 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:22.640597 2576 projected.go:194] Error preparing data for projected volume kube-api-access-f7f7p for pod openshift-network-diagnostics/network-check-target-rw8ct: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:52:22.640747 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:22.640640 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 07:52:22.640747 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:22.640652 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p podName:b17b42fe-4930-48b1-ac74-5439d9fc893c nodeName:}" failed. No retries permitted until 2026-04-17 07:52:30.640633901 +0000 UTC m=+18.297497092 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-f7f7p" (UniqueName: "kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p") pod "network-check-target-rw8ct" (UID: "b17b42fe-4930-48b1-ac74-5439d9fc893c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:52:22.640747 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:22.640707 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret podName:85a06191-13b3-4742-8598-8d7237fae7f3 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:24.640690236 +0000 UTC m=+12.297553430 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret") pod "global-pull-secret-syncer-pmpqp" (UID: "85a06191-13b3-4742-8598-8d7237fae7f3") : object "kube-system"/"original-pull-secret" not registered
Apr 17 07:52:22.860059 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:22.859308 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rw8ct"
Apr 17 07:52:22.860059 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:22.859410 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rw8ct" podUID="b17b42fe-4930-48b1-ac74-5439d9fc893c"
Apr 17 07:52:22.860059 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:22.859797 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wxvbl"
Apr 17 07:52:22.860059 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:22.859906 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wxvbl" podUID="0cd46437-1e4d-4927-88fe-3d5f18ee621d"
Apr 17 07:52:22.860059 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:22.859957 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pmpqp"
Apr 17 07:52:22.860059 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:22.860022 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pmpqp" podUID="85a06191-13b3-4742-8598-8d7237fae7f3"
Apr 17 07:52:24.654293 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:24.654187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret\") pod \"global-pull-secret-syncer-pmpqp\" (UID: \"85a06191-13b3-4742-8598-8d7237fae7f3\") " pod="kube-system/global-pull-secret-syncer-pmpqp"
Apr 17 07:52:24.654735 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:24.654374 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 07:52:24.654735 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:24.654471 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret podName:85a06191-13b3-4742-8598-8d7237fae7f3 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:28.654447446 +0000 UTC m=+16.311310647 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret") pod "global-pull-secret-syncer-pmpqp" (UID: "85a06191-13b3-4742-8598-8d7237fae7f3") : object "kube-system"/"original-pull-secret" not registered
Apr 17 07:52:24.857707 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:24.857662 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rw8ct"
Apr 17 07:52:24.857886 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:24.857779 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wxvbl"
Apr 17 07:52:24.857886 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:24.857793 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rw8ct" podUID="b17b42fe-4930-48b1-ac74-5439d9fc893c"
Apr 17 07:52:24.858002 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:24.857898 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wxvbl" podUID="0cd46437-1e4d-4927-88fe-3d5f18ee621d"
Apr 17 07:52:24.858002 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:24.857922 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pmpqp"
Apr 17 07:52:24.858100 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:24.858009 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pmpqp" podUID="85a06191-13b3-4742-8598-8d7237fae7f3"
Apr 17 07:52:26.860496 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:26.860470 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rw8ct"
Apr 17 07:52:26.860949 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:26.860472 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pmpqp"
Apr 17 07:52:26.860949 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:26.860578 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rw8ct" podUID="b17b42fe-4930-48b1-ac74-5439d9fc893c"
Apr 17 07:52:26.860949 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:26.860687 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pmpqp" podUID="85a06191-13b3-4742-8598-8d7237fae7f3"
Apr 17 07:52:26.860949 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:26.860472 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wxvbl"
Apr 17 07:52:26.860949 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:26.860838 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wxvbl" podUID="0cd46437-1e4d-4927-88fe-3d5f18ee621d"
Apr 17 07:52:28.682206 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:28.682168 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret\") pod \"global-pull-secret-syncer-pmpqp\" (UID: \"85a06191-13b3-4742-8598-8d7237fae7f3\") " pod="kube-system/global-pull-secret-syncer-pmpqp"
Apr 17 07:52:28.682580 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:28.682364 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 07:52:28.682580 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:28.682438 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret podName:85a06191-13b3-4742-8598-8d7237fae7f3 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:36.682418095 +0000 UTC m=+24.339281293 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret") pod "global-pull-secret-syncer-pmpqp" (UID: "85a06191-13b3-4742-8598-8d7237fae7f3") : object "kube-system"/"original-pull-secret" not registered
Apr 17 07:52:28.860395 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:28.860362 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wxvbl"
Apr 17 07:52:28.860536 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:28.860362 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pmpqp"
Apr 17 07:52:28.860536 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:28.860473 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wxvbl" podUID="0cd46437-1e4d-4927-88fe-3d5f18ee621d"
Apr 17 07:52:28.860643 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:28.860362 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rw8ct"
Apr 17 07:52:28.860643 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:28.860564 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pmpqp" podUID="85a06191-13b3-4742-8598-8d7237fae7f3"
Apr 17 07:52:28.860720 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:28.860655 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rw8ct" podUID="b17b42fe-4930-48b1-ac74-5439d9fc893c"
Apr 17 07:52:30.495110 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:30.495079 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs\") pod \"network-metrics-daemon-wxvbl\" (UID: \"0cd46437-1e4d-4927-88fe-3d5f18ee621d\") " pod="openshift-multus/network-metrics-daemon-wxvbl"
Apr 17 07:52:30.495509 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:30.495227 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:52:30.495509 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:30.495305 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs podName:0cd46437-1e4d-4927-88fe-3d5f18ee621d nodeName:}" failed. No retries permitted until 2026-04-17 07:52:46.495289455 +0000 UTC m=+34.152152658 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs") pod "network-metrics-daemon-wxvbl" (UID: "0cd46437-1e4d-4927-88fe-3d5f18ee621d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:52:30.696356 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:30.696322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7f7p\" (UniqueName: \"kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p\") pod \"network-check-target-rw8ct\" (UID: \"b17b42fe-4930-48b1-ac74-5439d9fc893c\") " pod="openshift-network-diagnostics/network-check-target-rw8ct"
Apr 17 07:52:30.696540 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:30.696462 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:52:30.696540 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:30.696485 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:52:30.696540 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:30.696500 2576 projected.go:194] Error preparing data for projected volume kube-api-access-f7f7p for pod openshift-network-diagnostics/network-check-target-rw8ct: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:52:30.696682 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:30.696560 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p podName:b17b42fe-4930-48b1-ac74-5439d9fc893c nodeName:}" failed.
No retries permitted until 2026-04-17 07:52:46.696545931 +0000 UTC m=+34.353409123 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-f7f7p" (UniqueName: "kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p") pod "network-check-target-rw8ct" (UID: "b17b42fe-4930-48b1-ac74-5439d9fc893c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:30.857621 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:30.857589 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rw8ct" Apr 17 07:52:30.857806 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:30.857589 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wxvbl" Apr 17 07:52:30.857806 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:30.857704 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rw8ct" podUID="b17b42fe-4930-48b1-ac74-5439d9fc893c" Apr 17 07:52:30.857913 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:30.857805 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wxvbl" podUID="0cd46437-1e4d-4927-88fe-3d5f18ee621d" Apr 17 07:52:30.857913 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:30.857843 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pmpqp" Apr 17 07:52:30.857999 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:30.857909 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pmpqp" podUID="85a06191-13b3-4742-8598-8d7237fae7f3" Apr 17 07:52:32.858698 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:32.858510 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rw8ct" Apr 17 07:52:32.859546 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:32.858610 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wxvbl" Apr 17 07:52:32.859546 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:32.858763 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rw8ct" podUID="b17b42fe-4930-48b1-ac74-5439d9fc893c" Apr 17 07:52:32.859546 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:32.858849 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wxvbl" podUID="0cd46437-1e4d-4927-88fe-3d5f18ee621d" Apr 17 07:52:32.859546 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:32.858635 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pmpqp" Apr 17 07:52:32.859546 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:32.858922 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-pmpqp" podUID="85a06191-13b3-4742-8598-8d7237fae7f3" Apr 17 07:52:33.037773 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.037655 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jxxvl" event={"ID":"502b7429-f580-4079-ad03-6c6b86f1903f","Type":"ContainerStarted","Data":"283464a6f8cbf85b7f2426a9f9bfd3d51ab6a0a6ecd0e4a76a678041c0f28408"} Apr 17 07:52:33.039212 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.039179 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67" event={"ID":"b284d533-260c-471d-b97a-e7e5c490b1da","Type":"ContainerStarted","Data":"fceb655b00514d2814dbdb1d1fde0a2316c7c31385ec5a0b91d50a2fb18b7e14"} Apr 17 07:52:33.040610 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.040550 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xvmfc" event={"ID":"5cf29621-68bf-43a5-94a8-643b390fca92","Type":"ContainerStarted","Data":"71633422bddea400e2f61ae34c0e55208154161f31f86d60389995687eb0afb3"} Apr 17 07:52:33.043115 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.042764 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log" Apr 17 07:52:33.043248 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.043213 2576 generic.go:358] "Generic (PLEG): container finished" podID="d0fdfd60-0abc-4f38-bff8-7936432cb97b" containerID="6384a3342095b728cb1d3c00d10b657ff0e3a3c147a562566ec5cc9f8093f5e5" exitCode=1 Apr 17 07:52:33.043390 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.043247 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" event={"ID":"d0fdfd60-0abc-4f38-bff8-7936432cb97b","Type":"ContainerStarted","Data":"a354d4a6e7dceb102892cb9378fa5d8c55754780ef9a34c7ee15e837e97b9725"} Apr 17 07:52:33.043390 
ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.043305 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" event={"ID":"d0fdfd60-0abc-4f38-bff8-7936432cb97b","Type":"ContainerDied","Data":"6384a3342095b728cb1d3c00d10b657ff0e3a3c147a562566ec5cc9f8093f5e5"} Apr 17 07:52:33.043390 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.043325 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" event={"ID":"d0fdfd60-0abc-4f38-bff8-7936432cb97b","Type":"ContainerStarted","Data":"c63eace9de2e1bcde3a87ccfaaf42ac4560b9b929f4471a305f7658359416465"} Apr 17 07:52:33.046694 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.046672 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" event={"ID":"e9a8b24f-6854-4d01-95e6-4bc5d1edd592","Type":"ContainerStarted","Data":"16e19b852410aee2c409c85b2e2aab4e64ad1dab645938fdc0f3bf90d135b147"} Apr 17 07:52:33.047906 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.047864 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9lt4x" event={"ID":"70917b3b-92d9-4406-9795-92a9f4be21ea","Type":"ContainerStarted","Data":"1d679f43193d188d9a592d04f9b188bc3e31030fd98aa19419ae8fa68c14ef32"} Apr 17 07:52:33.049570 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.049552 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ssttk" event={"ID":"5a79228b-b4cc-4d96-b9a2-a587214f9a0d","Type":"ContainerStarted","Data":"eb5f659ee6ef7787a48a966bbefeae106467c7693bf67364614286c4a021c14c"} Apr 17 07:52:33.051581 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.051556 2576 generic.go:358] "Generic (PLEG): container finished" podID="28fe626d-8e66-482b-b03e-847bb5829a0b" containerID="75a550b94558afb169d32ff0dc29b3309232e244bd388afca62064caa7649cdb" exitCode=0 Apr 17 07:52:33.051666 ip-10-0-137-165 kubenswrapper[2576]: I0417 
07:52:33.051595 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4n4s" event={"ID":"28fe626d-8e66-482b-b03e-847bb5829a0b","Type":"ContainerDied","Data":"75a550b94558afb169d32ff0dc29b3309232e244bd388afca62064caa7649cdb"} Apr 17 07:52:33.063065 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.063016 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-jxxvl" podStartSLOduration=3.170281568 podStartE2EDuration="20.062999357s" podCreationTimestamp="2026-04-17 07:52:13 +0000 UTC" firstStartedPulling="2026-04-17 07:52:15.427292133 +0000 UTC m=+3.084155325" lastFinishedPulling="2026-04-17 07:52:32.320009911 +0000 UTC m=+19.976873114" observedRunningTime="2026-04-17 07:52:33.050209951 +0000 UTC m=+20.707073157" watchObservedRunningTime="2026-04-17 07:52:33.062999357 +0000 UTC m=+20.719862571" Apr 17 07:52:33.063229 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.063202 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-xvmfc" podStartSLOduration=3.168195094 podStartE2EDuration="20.063194259s" podCreationTimestamp="2026-04-17 07:52:13 +0000 UTC" firstStartedPulling="2026-04-17 07:52:15.425063476 +0000 UTC m=+3.081926670" lastFinishedPulling="2026-04-17 07:52:32.320062635 +0000 UTC m=+19.976925835" observedRunningTime="2026-04-17 07:52:33.062248086 +0000 UTC m=+20.719111314" watchObservedRunningTime="2026-04-17 07:52:33.063194259 +0000 UTC m=+20.720057471" Apr 17 07:52:33.076752 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.076703 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9lt4x" podStartSLOduration=3.042493829 podStartE2EDuration="20.076687466s" podCreationTimestamp="2026-04-17 07:52:13 +0000 UTC" firstStartedPulling="2026-04-17 07:52:15.428972134 +0000 UTC m=+3.085835328" lastFinishedPulling="2026-04-17 07:52:32.463165775 +0000 UTC 
m=+20.120028965" observedRunningTime="2026-04-17 07:52:33.076386012 +0000 UTC m=+20.733249219" watchObservedRunningTime="2026-04-17 07:52:33.076687466 +0000 UTC m=+20.733550680" Apr 17 07:52:33.088495 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.088444 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ssttk" podStartSLOduration=3.194792665 podStartE2EDuration="20.088430221s" podCreationTimestamp="2026-04-17 07:52:13 +0000 UTC" firstStartedPulling="2026-04-17 07:52:15.426381633 +0000 UTC m=+3.083244835" lastFinishedPulling="2026-04-17 07:52:32.320019196 +0000 UTC m=+19.976882391" observedRunningTime="2026-04-17 07:52:33.087853821 +0000 UTC m=+20.744717036" watchObservedRunningTime="2026-04-17 07:52:33.088430221 +0000 UTC m=+20.745293435" Apr 17 07:52:33.103454 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.103413 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-cpkz9" podStartSLOduration=3.213223714 podStartE2EDuration="20.103399657s" podCreationTimestamp="2026-04-17 07:52:13 +0000 UTC" firstStartedPulling="2026-04-17 07:52:15.429842634 +0000 UTC m=+3.086705832" lastFinishedPulling="2026-04-17 07:52:32.320018572 +0000 UTC m=+19.976881775" observedRunningTime="2026-04-17 07:52:33.102860093 +0000 UTC m=+20.759723304" watchObservedRunningTime="2026-04-17 07:52:33.103399657 +0000 UTC m=+20.760262902" Apr 17 07:52:33.633574 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.633546 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 07:52:33.833974 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.833868 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T07:52:33.63356732Z","UUID":"14311a05-f9f6-4632-b52f-1ec023c4f5dd","Handler":null,"Name":"","Endpoint":""} Apr 17 07:52:33.837361 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.837337 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 07:52:33.837481 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:33.837368 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 07:52:34.055464 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:34.055130 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fkhld" event={"ID":"d7cf70ff-ceb3-4797-94a2-b29fbacd8f78","Type":"ContainerStarted","Data":"d34c66ba16c0cb48418e054a0eda6a6a50dc1f11fa73c57244e317b173597955"} Apr 17 07:52:34.057117 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:34.057035 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67" event={"ID":"b284d533-260c-471d-b97a-e7e5c490b1da","Type":"ContainerStarted","Data":"a480ccf8d35ad2207b95bc1368b26d0b8f3a770251696e969103d32bf1c424d7"} Apr 17 07:52:34.066500 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:34.066444 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-fkhld" podStartSLOduration=4.162515178 podStartE2EDuration="21.066428513s" podCreationTimestamp="2026-04-17 07:52:13 +0000 UTC" firstStartedPulling="2026-04-17 07:52:15.416123881 +0000 UTC m=+3.072987075" lastFinishedPulling="2026-04-17 07:52:32.320037212 +0000 UTC m=+19.976900410" observedRunningTime="2026-04-17 07:52:34.066266338 +0000 UTC m=+21.723129550" 
watchObservedRunningTime="2026-04-17 07:52:34.066428513 +0000 UTC m=+21.723291721" Apr 17 07:52:34.067998 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:34.067981 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log" Apr 17 07:52:34.068470 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:34.068443 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" event={"ID":"d0fdfd60-0abc-4f38-bff8-7936432cb97b","Type":"ContainerStarted","Data":"117fe7842f7a24ed95341b417ff981142a155b85f0240e0c4f892a718ad055f9"} Apr 17 07:52:34.068563 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:34.068479 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" event={"ID":"d0fdfd60-0abc-4f38-bff8-7936432cb97b","Type":"ContainerStarted","Data":"54e396459d40dc7bb86d29028c28ed0fa485122839fcc8da27a98d6030133a09"} Apr 17 07:52:34.068563 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:34.068490 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" event={"ID":"d0fdfd60-0abc-4f38-bff8-7936432cb97b","Type":"ContainerStarted","Data":"f849daa522f484e69c400fe2594e70ee9a185677d7c8426dfd33a67aac1be801"} Apr 17 07:52:34.857650 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:34.857618 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pmpqp" Apr 17 07:52:34.857829 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:34.857618 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wxvbl" Apr 17 07:52:34.857829 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:34.857731 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pmpqp" podUID="85a06191-13b3-4742-8598-8d7237fae7f3" Apr 17 07:52:34.857946 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:34.857826 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wxvbl" podUID="0cd46437-1e4d-4927-88fe-3d5f18ee621d" Apr 17 07:52:34.857946 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:34.857627 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rw8ct" Apr 17 07:52:34.857946 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:34.857895 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rw8ct" podUID="b17b42fe-4930-48b1-ac74-5439d9fc893c" Apr 17 07:52:35.072702 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:35.072609 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67" event={"ID":"b284d533-260c-471d-b97a-e7e5c490b1da","Type":"ContainerStarted","Data":"509f6566eb6ee18275975a9300691b997f4f9c2579684ad0a8ad76637b38e90a"} Apr 17 07:52:35.087735 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:35.087687 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wls67" podStartSLOduration=3.804725486 podStartE2EDuration="23.087672737s" podCreationTimestamp="2026-04-17 07:52:12 +0000 UTC" firstStartedPulling="2026-04-17 07:52:15.423923532 +0000 UTC m=+3.080786724" lastFinishedPulling="2026-04-17 07:52:34.706870784 +0000 UTC m=+22.363733975" observedRunningTime="2026-04-17 07:52:35.086891805 +0000 UTC m=+22.743755017" watchObservedRunningTime="2026-04-17 07:52:35.087672737 +0000 UTC m=+22.744535949" Apr 17 07:52:36.077634 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:36.077607 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log" Apr 17 07:52:36.078213 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:36.078021 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" event={"ID":"d0fdfd60-0abc-4f38-bff8-7936432cb97b","Type":"ContainerStarted","Data":"d13764c8741b35e734433b8eb9edd7dbee481e81989e5008dd39eccaa2346d9f"} Apr 17 07:52:36.719355 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:36.719317 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-jxxvl" Apr 17 07:52:36.720392 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:36.720369 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-jxxvl" Apr 17 07:52:36.736962 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:36.736929 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret\") pod \"global-pull-secret-syncer-pmpqp\" (UID: \"85a06191-13b3-4742-8598-8d7237fae7f3\") " pod="kube-system/global-pull-secret-syncer-pmpqp" Apr 17 07:52:36.737107 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:36.737055 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:36.737172 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:36.737118 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret podName:85a06191-13b3-4742-8598-8d7237fae7f3 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:52.737100365 +0000 UTC m=+40.393963556 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret") pod "global-pull-secret-syncer-pmpqp" (UID: "85a06191-13b3-4742-8598-8d7237fae7f3") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:36.858328 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:36.858295 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rw8ct" Apr 17 07:52:36.858492 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:36.858295 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-pmpqp" Apr 17 07:52:36.858492 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:36.858411 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rw8ct" podUID="b17b42fe-4930-48b1-ac74-5439d9fc893c" Apr 17 07:52:36.858492 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:36.858470 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pmpqp" podUID="85a06191-13b3-4742-8598-8d7237fae7f3" Apr 17 07:52:36.858492 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:36.858295 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wxvbl" Apr 17 07:52:36.858660 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:36.858562 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wxvbl" podUID="0cd46437-1e4d-4927-88fe-3d5f18ee621d" Apr 17 07:52:37.079751 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:37.079675 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-jxxvl" Apr 17 07:52:37.080270 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:37.080138 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-jxxvl" Apr 17 07:52:38.083634 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:38.083380 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log" Apr 17 07:52:38.084072 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:38.083868 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" event={"ID":"d0fdfd60-0abc-4f38-bff8-7936432cb97b","Type":"ContainerStarted","Data":"24ece8049f613455f6b9fe065239228bdadc5d26ec4af1b809844b07a401674f"} Apr 17 07:52:38.084230 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:38.084186 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:38.084430 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:38.084411 2576 scope.go:117] "RemoveContainer" containerID="6384a3342095b728cb1d3c00d10b657ff0e3a3c147a562566ec5cc9f8093f5e5" Apr 17 07:52:38.085397 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:38.085375 2576 generic.go:358] "Generic (PLEG): container finished" podID="28fe626d-8e66-482b-b03e-847bb5829a0b" containerID="c9fe3f3a17972753c8b4c2130d08c3b4672ffdb5e500e133dfb2badf7942f8f8" exitCode=0 Apr 17 07:52:38.085478 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:38.085457 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4n4s" 
event={"ID":"28fe626d-8e66-482b-b03e-847bb5829a0b","Type":"ContainerDied","Data":"c9fe3f3a17972753c8b4c2130d08c3b4672ffdb5e500e133dfb2badf7942f8f8"} Apr 17 07:52:38.100311 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:38.100290 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:38.858157 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:38.858118 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wxvbl" Apr 17 07:52:38.858369 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:38.858118 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rw8ct" Apr 17 07:52:38.858369 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:38.858258 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wxvbl" podUID="0cd46437-1e4d-4927-88fe-3d5f18ee621d" Apr 17 07:52:38.858369 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:38.858118 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pmpqp" Apr 17 07:52:38.858369 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:38.858315 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rw8ct" podUID="b17b42fe-4930-48b1-ac74-5439d9fc893c" Apr 17 07:52:38.858561 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:38.858402 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pmpqp" podUID="85a06191-13b3-4742-8598-8d7237fae7f3" Apr 17 07:52:39.089393 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:39.089361 2576 generic.go:358] "Generic (PLEG): container finished" podID="28fe626d-8e66-482b-b03e-847bb5829a0b" containerID="20cb7aa973d926e511b98f254a1b04046788f6953d01977d9f1a771e2838a342" exitCode=0 Apr 17 07:52:39.089797 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:39.089454 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4n4s" event={"ID":"28fe626d-8e66-482b-b03e-847bb5829a0b","Type":"ContainerDied","Data":"20cb7aa973d926e511b98f254a1b04046788f6953d01977d9f1a771e2838a342"} Apr 17 07:52:39.093122 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:39.093104 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log" Apr 17 07:52:39.093483 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:39.093457 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" event={"ID":"d0fdfd60-0abc-4f38-bff8-7936432cb97b","Type":"ContainerStarted","Data":"d6691cbcda80aaff3668440d127575dace12ae718544ae43b6312598a400828d"} Apr 17 07:52:39.094045 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:39.094022 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 07:52:39.097910 ip-10-0-137-165 
kubenswrapper[2576]: I0417 07:52:39.094525 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:39.109998 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:39.109978 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:39.140566 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:39.140524 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" podStartSLOduration=9.164369893 podStartE2EDuration="26.140508501s" podCreationTimestamp="2026-04-17 07:52:13 +0000 UTC" firstStartedPulling="2026-04-17 07:52:15.419026664 +0000 UTC m=+3.075889855" lastFinishedPulling="2026-04-17 07:52:32.395165259 +0000 UTC m=+20.052028463" observedRunningTime="2026-04-17 07:52:39.13917699 +0000 UTC m=+26.796040203" watchObservedRunningTime="2026-04-17 07:52:39.140508501 +0000 UTC m=+26.797371715" Apr 17 07:52:39.576354 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:39.576096 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pmpqp"] Apr 17 07:52:39.576502 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:39.576434 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pmpqp" Apr 17 07:52:39.576571 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:39.576547 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-pmpqp" podUID="85a06191-13b3-4742-8598-8d7237fae7f3" Apr 17 07:52:39.578682 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:39.578660 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rw8ct"] Apr 17 07:52:39.578779 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:39.578753 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rw8ct" Apr 17 07:52:39.578844 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:39.578816 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rw8ct" podUID="b17b42fe-4930-48b1-ac74-5439d9fc893c" Apr 17 07:52:39.587262 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:39.587238 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wxvbl"] Apr 17 07:52:39.587374 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:39.587350 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wxvbl" Apr 17 07:52:39.587448 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:39.587428 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wxvbl" podUID="0cd46437-1e4d-4927-88fe-3d5f18ee621d" Apr 17 07:52:40.097239 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:40.097202 2576 generic.go:358] "Generic (PLEG): container finished" podID="28fe626d-8e66-482b-b03e-847bb5829a0b" containerID="64c14b71b5fd6c2abcf9abe1ca7e057f28d560722ff75d4ffc923ffcc02144aa" exitCode=0 Apr 17 07:52:40.097716 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:40.097300 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4n4s" event={"ID":"28fe626d-8e66-482b-b03e-847bb5829a0b","Type":"ContainerDied","Data":"64c14b71b5fd6c2abcf9abe1ca7e057f28d560722ff75d4ffc923ffcc02144aa"} Apr 17 07:52:40.097716 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:40.097505 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 07:52:40.678993 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:40.678953 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" Apr 17 07:52:40.858387 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:40.858345 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rw8ct" Apr 17 07:52:40.858558 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:40.858480 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rw8ct" podUID="b17b42fe-4930-48b1-ac74-5439d9fc893c" Apr 17 07:52:41.858168 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:41.858133 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wxvbl" Apr 17 07:52:41.858721 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:41.858133 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pmpqp" Apr 17 07:52:41.858721 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:41.858288 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wxvbl" podUID="0cd46437-1e4d-4927-88fe-3d5f18ee621d" Apr 17 07:52:41.858721 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:41.858348 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pmpqp" podUID="85a06191-13b3-4742-8598-8d7237fae7f3" Apr 17 07:52:42.113848 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:42.113788 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8" podUID="d0fdfd60-0abc-4f38-bff8-7936432cb97b" containerName="ovnkube-controller" probeResult="failure" output="" Apr 17 07:52:42.858385 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:42.858357 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rw8ct" Apr 17 07:52:42.859065 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:42.858499 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rw8ct" podUID="b17b42fe-4930-48b1-ac74-5439d9fc893c" Apr 17 07:52:43.858540 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:43.858505 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pmpqp" Apr 17 07:52:43.859111 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:43.858515 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wxvbl" Apr 17 07:52:43.859111 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:43.858620 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pmpqp" podUID="85a06191-13b3-4742-8598-8d7237fae7f3" Apr 17 07:52:43.859111 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:43.858732 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wxvbl" podUID="0cd46437-1e4d-4927-88fe-3d5f18ee621d" Apr 17 07:52:44.858689 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:44.858651 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rw8ct" Apr 17 07:52:44.859183 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:44.858784 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rw8ct" podUID="b17b42fe-4930-48b1-ac74-5439d9fc893c" Apr 17 07:52:45.182162 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.182078 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-165.ec2.internal" event="NodeReady" Apr 17 07:52:45.182368 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.182222 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 07:52:45.215096 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.215050 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-529rs"] Apr 17 07:52:45.243540 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.243510 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-bc6b6b8b5-xzvk9"] Apr 17 07:52:45.243768 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.243749 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs" Apr 17 07:52:45.246085 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.246031 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 17 07:52:45.246217 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.246143 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-lp56s\"" Apr 17 07:52:45.246217 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.246161 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 17 07:52:45.258642 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.258615 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6f2m8"] Apr 17 07:52:45.258754 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.258723 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.261292 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.261258 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-b7smv\"" Apr 17 07:52:45.261292 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.261270 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 07:52:45.261447 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.261319 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 07:52:45.261447 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.261357 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 07:52:45.266589 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.266566 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 07:52:45.276647 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.276625 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-529rs"] Apr 17 07:52:45.276647 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.276651 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9w6bd"] Apr 17 07:52:45.276800 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.276773 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6f2m8" Apr 17 07:52:45.280344 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.279209 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 07:52:45.280344 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.279402 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 07:52:45.280344 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.279578 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-d4l2f\"" Apr 17 07:52:45.295224 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.295199 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-bc6b6b8b5-xzvk9"] Apr 17 07:52:45.295364 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.295229 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6f2m8"] Apr 17 07:52:45.295364 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.295242 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9w6bd"] Apr 17 07:52:45.295364 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.295344 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9w6bd" Apr 17 07:52:45.297639 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.297582 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 07:52:45.297639 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.297600 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 07:52:45.297639 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.297623 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 07:52:45.297836 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.297586 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fmxgb\"" Apr 17 07:52:45.404655 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.404622 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.404655 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.404654 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/72a398ea-84ce-463a-aa82-659a22cc916a-registry-certificates\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.404655 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.404678 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-bound-sa-token\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.404886 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.404697 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-427pm\" (UniqueName: \"kubernetes.io/projected/b1a369df-257c-47a4-96da-3025f897b1dd-kube-api-access-427pm\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8" Apr 17 07:52:45.404886 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.404753 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/72a398ea-84ce-463a-aa82-659a22cc916a-ca-trust-extracted\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.404886 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.404794 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1a369df-257c-47a4-96da-3025f897b1dd-config-volume\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8" Apr 17 07:52:45.404886 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.404821 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-529rs\" (UID: 
\"7a34eada-e251-4bc7-8937-8f933c0cbd6f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs" Apr 17 07:52:45.404886 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.404874 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72a398ea-84ce-463a-aa82-659a22cc916a-trusted-ca\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.405039 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.404894 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/72a398ea-84ce-463a-aa82-659a22cc916a-installation-pull-secrets\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.405039 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.404912 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8" Apr 17 07:52:45.405039 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.404946 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l798c\" (UniqueName: \"kubernetes.io/projected/8ff81f11-f2e2-4838-a775-e57edc28571c-kube-api-access-l798c\") pod \"ingress-canary-9w6bd\" (UID: \"8ff81f11-f2e2-4838-a775-e57edc28571c\") " pod="openshift-ingress-canary/ingress-canary-9w6bd" Apr 17 07:52:45.405039 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.404974 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49rsw\" (UniqueName: \"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-kube-api-access-49rsw\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.405039 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.404994 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b1a369df-257c-47a4-96da-3025f897b1dd-tmp-dir\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8" Apr 17 07:52:45.405039 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.405011 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert\") pod \"ingress-canary-9w6bd\" (UID: \"8ff81f11-f2e2-4838-a775-e57edc28571c\") " pod="openshift-ingress-canary/ingress-canary-9w6bd" Apr 17 07:52:45.405039 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.405027 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7a34eada-e251-4bc7-8937-8f933c0cbd6f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-529rs\" (UID: \"7a34eada-e251-4bc7-8937-8f933c0cbd6f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs" Apr 17 07:52:45.405244 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.405048 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/72a398ea-84ce-463a-aa82-659a22cc916a-image-registry-private-configuration\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: 
\"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.506400 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.506317 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.506400 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.506353 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/72a398ea-84ce-463a-aa82-659a22cc916a-registry-certificates\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.506400 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.506381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-bound-sa-token\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.506650 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.506406 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-427pm\" (UniqueName: \"kubernetes.io/projected/b1a369df-257c-47a4-96da-3025f897b1dd-kube-api-access-427pm\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8" Apr 17 07:52:45.506650 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.506437 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/72a398ea-84ce-463a-aa82-659a22cc916a-ca-trust-extracted\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.506650 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.506456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1a369df-257c-47a4-96da-3025f897b1dd-config-volume\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8" Apr 17 07:52:45.506650 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:45.506464 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:52:45.506650 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.506479 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-529rs\" (UID: \"7a34eada-e251-4bc7-8937-8f933c0cbd6f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs" Apr 17 07:52:45.506650 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:45.506484 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bc6b6b8b5-xzvk9: secret "image-registry-tls" not found Apr 17 07:52:45.506650 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:45.506576 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 07:52:45.506650 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:45.506592 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls podName:72a398ea-84ce-463a-aa82-659a22cc916a nodeName:}" failed. No retries permitted until 2026-04-17 07:52:46.006571023 +0000 UTC m=+33.663434220 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls") pod "image-registry-bc6b6b8b5-xzvk9" (UID: "72a398ea-84ce-463a-aa82-659a22cc916a") : secret "image-registry-tls" not found Apr 17 07:52:45.506650 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:45.506634 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert podName:7a34eada-e251-4bc7-8937-8f933c0cbd6f nodeName:}" failed. No retries permitted until 2026-04-17 07:52:46.006617717 +0000 UTC m=+33.663480918 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-529rs" (UID: "7a34eada-e251-4bc7-8937-8f933c0cbd6f") : secret "networking-console-plugin-cert" not found Apr 17 07:52:45.507101 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.506667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72a398ea-84ce-463a-aa82-659a22cc916a-trusted-ca\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.507101 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.506702 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/72a398ea-84ce-463a-aa82-659a22cc916a-installation-pull-secrets\") pod 
\"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.507101 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.506728 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8" Apr 17 07:52:45.507101 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.506768 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l798c\" (UniqueName: \"kubernetes.io/projected/8ff81f11-f2e2-4838-a775-e57edc28571c-kube-api-access-l798c\") pod \"ingress-canary-9w6bd\" (UID: \"8ff81f11-f2e2-4838-a775-e57edc28571c\") " pod="openshift-ingress-canary/ingress-canary-9w6bd" Apr 17 07:52:45.507101 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.506802 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49rsw\" (UniqueName: \"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-kube-api-access-49rsw\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.507101 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.506829 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b1a369df-257c-47a4-96da-3025f897b1dd-tmp-dir\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8" Apr 17 07:52:45.507101 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.506857 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert\") pod \"ingress-canary-9w6bd\" (UID: \"8ff81f11-f2e2-4838-a775-e57edc28571c\") " pod="openshift-ingress-canary/ingress-canary-9w6bd" Apr 17 07:52:45.507101 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.506881 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7a34eada-e251-4bc7-8937-8f933c0cbd6f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-529rs\" (UID: \"7a34eada-e251-4bc7-8937-8f933c0cbd6f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs" Apr 17 07:52:45.507101 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:45.506890 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:52:45.507101 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.506916 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/72a398ea-84ce-463a-aa82-659a22cc916a-image-registry-private-configuration\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.507101 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:45.506948 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls podName:b1a369df-257c-47a4-96da-3025f897b1dd nodeName:}" failed. No retries permitted until 2026-04-17 07:52:46.00693369 +0000 UTC m=+33.663796894 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls") pod "dns-default-6f2m8" (UID: "b1a369df-257c-47a4-96da-3025f897b1dd") : secret "dns-default-metrics-tls" not found Apr 17 07:52:45.507101 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.507015 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/72a398ea-84ce-463a-aa82-659a22cc916a-registry-certificates\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.507101 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.507050 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1a369df-257c-47a4-96da-3025f897b1dd-config-volume\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8" Apr 17 07:52:45.507101 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.506800 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/72a398ea-84ce-463a-aa82-659a22cc916a-ca-trust-extracted\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.507101 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:45.507097 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:52:45.507781 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:45.507146 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert podName:8ff81f11-f2e2-4838-a775-e57edc28571c nodeName:}" failed. 
No retries permitted until 2026-04-17 07:52:46.00713254 +0000 UTC m=+33.663995730 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert") pod "ingress-canary-9w6bd" (UID: "8ff81f11-f2e2-4838-a775-e57edc28571c") : secret "canary-serving-cert" not found Apr 17 07:52:45.507781 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.507245 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b1a369df-257c-47a4-96da-3025f897b1dd-tmp-dir\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8" Apr 17 07:52:45.507781 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.507577 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7a34eada-e251-4bc7-8937-8f933c0cbd6f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-529rs\" (UID: \"7a34eada-e251-4bc7-8937-8f933c0cbd6f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs" Apr 17 07:52:45.507781 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.507595 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72a398ea-84ce-463a-aa82-659a22cc916a-trusted-ca\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.510479 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.510457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/72a398ea-84ce-463a-aa82-659a22cc916a-image-registry-private-configuration\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " 
pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.510577 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.510481 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/72a398ea-84ce-463a-aa82-659a22cc916a-installation-pull-secrets\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.518152 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.517992 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-bound-sa-token\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.518225 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.518080 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-427pm\" (UniqueName: \"kubernetes.io/projected/b1a369df-257c-47a4-96da-3025f897b1dd-kube-api-access-427pm\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8" Apr 17 07:52:45.518352 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.518336 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l798c\" (UniqueName: \"kubernetes.io/projected/8ff81f11-f2e2-4838-a775-e57edc28571c-kube-api-access-l798c\") pod \"ingress-canary-9w6bd\" (UID: \"8ff81f11-f2e2-4838-a775-e57edc28571c\") " pod="openshift-ingress-canary/ingress-canary-9w6bd" Apr 17 07:52:45.518675 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.518659 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49rsw\" (UniqueName: 
\"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-kube-api-access-49rsw\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:45.858119 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.858093 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pmpqp" Apr 17 07:52:45.858260 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.858096 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wxvbl" Apr 17 07:52:45.860899 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.860877 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jkb85\"" Apr 17 07:52:45.861208 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.860918 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 07:52:45.861208 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:45.860919 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 07:52:46.011467 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:46.011406 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:46.011590 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:46.011469 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-529rs\" (UID: \"7a34eada-e251-4bc7-8937-8f933c0cbd6f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs" Apr 17 07:52:46.011590 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:46.011537 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8" Apr 17 07:52:46.011590 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:46.011578 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert\") pod \"ingress-canary-9w6bd\" (UID: \"8ff81f11-f2e2-4838-a775-e57edc28571c\") " pod="openshift-ingress-canary/ingress-canary-9w6bd" Apr 17 07:52:46.011738 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:46.011576 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:52:46.011738 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:46.011695 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bc6b6b8b5-xzvk9: secret "image-registry-tls" not found Apr 17 07:52:46.011738 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:46.011600 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 07:52:46.011921 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:46.011639 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:52:46.011921 ip-10-0-137-165 kubenswrapper[2576]: 
E0417 07:52:46.011755 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls podName:72a398ea-84ce-463a-aa82-659a22cc916a nodeName:}" failed. No retries permitted until 2026-04-17 07:52:47.011735043 +0000 UTC m=+34.668598240 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls") pod "image-registry-bc6b6b8b5-xzvk9" (UID: "72a398ea-84ce-463a-aa82-659a22cc916a") : secret "image-registry-tls" not found Apr 17 07:52:46.011921 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:46.011652 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:52:46.011921 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:46.011772 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert podName:7a34eada-e251-4bc7-8937-8f933c0cbd6f nodeName:}" failed. No retries permitted until 2026-04-17 07:52:47.011762747 +0000 UTC m=+34.668625941 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-529rs" (UID: "7a34eada-e251-4bc7-8937-8f933c0cbd6f") : secret "networking-console-plugin-cert" not found Apr 17 07:52:46.011921 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:46.011787 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls podName:b1a369df-257c-47a4-96da-3025f897b1dd nodeName:}" failed. No retries permitted until 2026-04-17 07:52:47.011779924 +0000 UTC m=+34.668643119 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls") pod "dns-default-6f2m8" (UID: "b1a369df-257c-47a4-96da-3025f897b1dd") : secret "dns-default-metrics-tls" not found Apr 17 07:52:46.011921 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:46.011799 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert podName:8ff81f11-f2e2-4838-a775-e57edc28571c nodeName:}" failed. No retries permitted until 2026-04-17 07:52:47.011792758 +0000 UTC m=+34.668655951 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert") pod "ingress-canary-9w6bd" (UID: "8ff81f11-f2e2-4838-a775-e57edc28571c") : secret "canary-serving-cert" not found Apr 17 07:52:46.111958 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:46.111929 2576 generic.go:358] "Generic (PLEG): container finished" podID="28fe626d-8e66-482b-b03e-847bb5829a0b" containerID="36efef95bc6334508af48ac0b404cc4970fbc05182b28d39160315b150ddd261" exitCode=0 Apr 17 07:52:46.111958 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:46.111968 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4n4s" event={"ID":"28fe626d-8e66-482b-b03e-847bb5829a0b","Type":"ContainerDied","Data":"36efef95bc6334508af48ac0b404cc4970fbc05182b28d39160315b150ddd261"} Apr 17 07:52:46.515685 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:46.515629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs\") pod \"network-metrics-daemon-wxvbl\" (UID: \"0cd46437-1e4d-4927-88fe-3d5f18ee621d\") " pod="openshift-multus/network-metrics-daemon-wxvbl" Apr 17 07:52:46.515803 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:46.515771 
2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 07:52:46.515858 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:46.515842 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs podName:0cd46437-1e4d-4927-88fe-3d5f18ee621d nodeName:}" failed. No retries permitted until 2026-04-17 07:53:18.515818442 +0000 UTC m=+66.172681633 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs") pod "network-metrics-daemon-wxvbl" (UID: "0cd46437-1e4d-4927-88fe-3d5f18ee621d") : secret "metrics-daemon-secret" not found Apr 17 07:52:46.716779 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:46.716735 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7f7p\" (UniqueName: \"kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p\") pod \"network-check-target-rw8ct\" (UID: \"b17b42fe-4930-48b1-ac74-5439d9fc893c\") " pod="openshift-network-diagnostics/network-check-target-rw8ct" Apr 17 07:52:46.716998 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:46.716863 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:52:46.716998 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:46.716877 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:52:46.716998 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:46.716887 2576 projected.go:194] Error preparing data for projected volume kube-api-access-f7f7p for pod openshift-network-diagnostics/network-check-target-rw8ct: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:46.716998 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:46.716934 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p podName:b17b42fe-4930-48b1-ac74-5439d9fc893c nodeName:}" failed. No retries permitted until 2026-04-17 07:53:18.716920797 +0000 UTC m=+66.373783989 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-f7f7p" (UniqueName: "kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p") pod "network-check-target-rw8ct" (UID: "b17b42fe-4930-48b1-ac74-5439d9fc893c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:46.858435 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:46.858404 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rw8ct" Apr 17 07:52:46.861153 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:46.861126 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 07:52:46.861153 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:46.861145 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-n45c2\"" Apr 17 07:52:46.861592 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:46.861125 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 07:52:47.019997 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:47.019686 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert\") pod \"ingress-canary-9w6bd\" (UID: \"8ff81f11-f2e2-4838-a775-e57edc28571c\") " pod="openshift-ingress-canary/ingress-canary-9w6bd" Apr 17 07:52:47.019997 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:47.019888 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:47.019997 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:47.019953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-529rs\" (UID: \"7a34eada-e251-4bc7-8937-8f933c0cbd6f\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs" Apr 17 07:52:47.020311 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:47.020035 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8" Apr 17 07:52:47.020311 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:47.019898 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:52:47.020428 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:47.020353 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert podName:8ff81f11-f2e2-4838-a775-e57edc28571c nodeName:}" failed. No retries permitted until 2026-04-17 07:52:49.020333873 +0000 UTC m=+36.677197068 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert") pod "ingress-canary-9w6bd" (UID: "8ff81f11-f2e2-4838-a775-e57edc28571c") : secret "canary-serving-cert" not found Apr 17 07:52:47.022088 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:47.019973 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:52:47.022088 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:47.020528 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bc6b6b8b5-xzvk9: secret "image-registry-tls" not found Apr 17 07:52:47.022088 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:47.020587 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls podName:72a398ea-84ce-463a-aa82-659a22cc916a nodeName:}" failed. No retries permitted until 2026-04-17 07:52:49.020565133 +0000 UTC m=+36.677428344 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls") pod "image-registry-bc6b6b8b5-xzvk9" (UID: "72a398ea-84ce-463a-aa82-659a22cc916a") : secret "image-registry-tls" not found Apr 17 07:52:47.022088 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:47.020186 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:52:47.022088 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:47.020633 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls podName:b1a369df-257c-47a4-96da-3025f897b1dd nodeName:}" failed. No retries permitted until 2026-04-17 07:52:49.020617337 +0000 UTC m=+36.677480539 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls") pod "dns-default-6f2m8" (UID: "b1a369df-257c-47a4-96da-3025f897b1dd") : secret "dns-default-metrics-tls" not found Apr 17 07:52:47.022088 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:47.020259 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 07:52:47.022088 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:47.020672 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert podName:7a34eada-e251-4bc7-8937-8f933c0cbd6f nodeName:}" failed. No retries permitted until 2026-04-17 07:52:49.020658382 +0000 UTC m=+36.677521584 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-529rs" (UID: "7a34eada-e251-4bc7-8937-8f933c0cbd6f") : secret "networking-console-plugin-cert" not found Apr 17 07:52:47.116137 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:47.116045 2576 generic.go:358] "Generic (PLEG): container finished" podID="28fe626d-8e66-482b-b03e-847bb5829a0b" containerID="e67380eb1a9f9bc7220204c9163217b3728589b20658578c5b17cc6a438d2620" exitCode=0 Apr 17 07:52:47.116137 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:47.116115 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4n4s" event={"ID":"28fe626d-8e66-482b-b03e-847bb5829a0b","Type":"ContainerDied","Data":"e67380eb1a9f9bc7220204c9163217b3728589b20658578c5b17cc6a438d2620"} Apr 17 07:52:48.120742 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:48.120706 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-w4n4s" event={"ID":"28fe626d-8e66-482b-b03e-847bb5829a0b","Type":"ContainerStarted","Data":"7d5b3c6dc345120f6629f3d41817f78b3a43f21e91d7ec454af31fada50d216a"} Apr 17 07:52:48.144453 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:48.144405 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-w4n4s" podStartSLOduration=5.805330888 podStartE2EDuration="36.144391664s" podCreationTimestamp="2026-04-17 07:52:12 +0000 UTC" firstStartedPulling="2026-04-17 07:52:15.41810817 +0000 UTC m=+3.074971365" lastFinishedPulling="2026-04-17 07:52:45.757168943 +0000 UTC m=+33.414032141" observedRunningTime="2026-04-17 07:52:48.143793984 +0000 UTC m=+35.800657196" watchObservedRunningTime="2026-04-17 07:52:48.144391664 +0000 UTC m=+35.801254876" Apr 17 07:52:49.034480 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:49.034441 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-529rs\" (UID: \"7a34eada-e251-4bc7-8937-8f933c0cbd6f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs" Apr 17 07:52:49.034653 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:49.034494 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8" Apr 17 07:52:49.034653 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:49.034519 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert\") pod 
\"ingress-canary-9w6bd\" (UID: \"8ff81f11-f2e2-4838-a775-e57edc28571c\") " pod="openshift-ingress-canary/ingress-canary-9w6bd" Apr 17 07:52:49.034653 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:49.034563 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:52:49.034653 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:49.034593 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 07:52:49.034653 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:49.034639 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:52:49.034653 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:49.034649 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:52:49.034653 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:49.034659 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bc6b6b8b5-xzvk9: secret "image-registry-tls" not found Apr 17 07:52:49.034885 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:49.034674 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:52:49.034885 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:49.034677 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert podName:7a34eada-e251-4bc7-8937-8f933c0cbd6f nodeName:}" failed. 
No retries permitted until 2026-04-17 07:52:53.034655566 +0000 UTC m=+40.691518757 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-529rs" (UID: "7a34eada-e251-4bc7-8937-8f933c0cbd6f") : secret "networking-console-plugin-cert" not found Apr 17 07:52:49.034885 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:49.034741 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls podName:b1a369df-257c-47a4-96da-3025f897b1dd nodeName:}" failed. No retries permitted until 2026-04-17 07:52:53.034724593 +0000 UTC m=+40.691587783 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls") pod "dns-default-6f2m8" (UID: "b1a369df-257c-47a4-96da-3025f897b1dd") : secret "dns-default-metrics-tls" not found Apr 17 07:52:49.034885 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:49.034756 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls podName:72a398ea-84ce-463a-aa82-659a22cc916a nodeName:}" failed. No retries permitted until 2026-04-17 07:52:53.034747524 +0000 UTC m=+40.691610714 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls") pod "image-registry-bc6b6b8b5-xzvk9" (UID: "72a398ea-84ce-463a-aa82-659a22cc916a") : secret "image-registry-tls" not found
Apr 17 07:52:49.034885 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:49.034780 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert podName:8ff81f11-f2e2-4838-a775-e57edc28571c nodeName:}" failed. No retries permitted until 2026-04-17 07:52:53.034771458 +0000 UTC m=+40.691634652 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert") pod "ingress-canary-9w6bd" (UID: "8ff81f11-f2e2-4838-a775-e57edc28571c") : secret "canary-serving-cert" not found
Apr 17 07:52:52.766012 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:52.765819 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret\") pod \"global-pull-secret-syncer-pmpqp\" (UID: \"85a06191-13b3-4742-8598-8d7237fae7f3\") " pod="kube-system/global-pull-secret-syncer-pmpqp"
Apr 17 07:52:52.769602 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:52.769582 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/85a06191-13b3-4742-8598-8d7237fae7f3-original-pull-secret\") pod \"global-pull-secret-syncer-pmpqp\" (UID: \"85a06191-13b3-4742-8598-8d7237fae7f3\") " pod="kube-system/global-pull-secret-syncer-pmpqp"
Apr 17 07:52:53.067591 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:53.067482 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9"
Apr 17 07:52:53.067591 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:53.067539 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-529rs\" (UID: \"7a34eada-e251-4bc7-8937-8f933c0cbd6f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs"
Apr 17 07:52:53.067591 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:53.067586 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8"
Apr 17 07:52:53.067839 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:53.067612 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert\") pod \"ingress-canary-9w6bd\" (UID: \"8ff81f11-f2e2-4838-a775-e57edc28571c\") " pod="openshift-ingress-canary/ingress-canary-9w6bd"
Apr 17 07:52:53.067839 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:53.067652 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 07:52:53.067839 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:53.067692 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:53.067839 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:53.067699 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 07:52:53.067839 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:53.067716 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bc6b6b8b5-xzvk9: secret "image-registry-tls" not found
Apr 17 07:52:53.067839 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:53.067725 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert podName:7a34eada-e251-4bc7-8937-8f933c0cbd6f nodeName:}" failed. No retries permitted until 2026-04-17 07:53:01.067708404 +0000 UTC m=+48.724571598 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-529rs" (UID: "7a34eada-e251-4bc7-8937-8f933c0cbd6f") : secret "networking-console-plugin-cert" not found
Apr 17 07:52:53.067839 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:53.067704 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:53.067839 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:53.067739 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert podName:8ff81f11-f2e2-4838-a775-e57edc28571c nodeName:}" failed. No retries permitted until 2026-04-17 07:53:01.067733265 +0000 UTC m=+48.724596455 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert") pod "ingress-canary-9w6bd" (UID: "8ff81f11-f2e2-4838-a775-e57edc28571c") : secret "canary-serving-cert" not found
Apr 17 07:52:53.067839 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:53.067772 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls podName:72a398ea-84ce-463a-aa82-659a22cc916a nodeName:}" failed. No retries permitted until 2026-04-17 07:53:01.067759833 +0000 UTC m=+48.724623023 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls") pod "image-registry-bc6b6b8b5-xzvk9" (UID: "72a398ea-84ce-463a-aa82-659a22cc916a") : secret "image-registry-tls" not found
Apr 17 07:52:53.067839 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:52:53.067781 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls podName:b1a369df-257c-47a4-96da-3025f897b1dd nodeName:}" failed. No retries permitted until 2026-04-17 07:53:01.067776012 +0000 UTC m=+48.724639202 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls") pod "dns-default-6f2m8" (UID: "b1a369df-257c-47a4-96da-3025f897b1dd") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:53.069506 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:53.069488 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pmpqp"
Apr 17 07:52:53.190032 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:53.190003 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pmpqp"]
Apr 17 07:52:53.193332 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:52:53.193299 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85a06191_13b3_4742_8598_8d7237fae7f3.slice/crio-6dd6a40ec1c545dac637d8dab6a26d4e06472bdb1b921265137730fa597bd795 WatchSource:0}: Error finding container 6dd6a40ec1c545dac637d8dab6a26d4e06472bdb1b921265137730fa597bd795: Status 404 returned error can't find the container with id 6dd6a40ec1c545dac637d8dab6a26d4e06472bdb1b921265137730fa597bd795
Apr 17 07:52:54.133352 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:54.133310 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pmpqp" event={"ID":"85a06191-13b3-4742-8598-8d7237fae7f3","Type":"ContainerStarted","Data":"6dd6a40ec1c545dac637d8dab6a26d4e06472bdb1b921265137730fa597bd795"}
Apr 17 07:52:58.142833 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:58.142796 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pmpqp" event={"ID":"85a06191-13b3-4742-8598-8d7237fae7f3","Type":"ContainerStarted","Data":"1d6afd27b41198bad442f10cd13111ea43ac66fd0bd10c142e7e8338af07d7e8"}
Apr 17 07:52:58.156500 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:52:58.156457 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-pmpqp" podStartSLOduration=34.288656667 podStartE2EDuration="38.156443707s" podCreationTimestamp="2026-04-17 07:52:20 +0000 UTC" firstStartedPulling="2026-04-17 07:52:53.194943727 +0000 UTC m=+40.851806917" lastFinishedPulling="2026-04-17 07:52:57.062730766 +0000 UTC m=+44.719593957" observedRunningTime="2026-04-17 07:52:58.155512134 +0000 UTC m=+45.812375370" watchObservedRunningTime="2026-04-17 07:52:58.156443707 +0000 UTC m=+45.813306921"
Apr 17 07:53:01.133599 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:01.133565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-529rs\" (UID: \"7a34eada-e251-4bc7-8937-8f933c0cbd6f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs"
Apr 17 07:53:01.134022 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:01.133611 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8"
Apr 17 07:53:01.134022 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:01.133633 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert\") pod \"ingress-canary-9w6bd\" (UID: \"8ff81f11-f2e2-4838-a775-e57edc28571c\") " pod="openshift-ingress-canary/ingress-canary-9w6bd"
Apr 17 07:53:01.134022 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:01.133673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9"
Apr 17 07:53:01.134022 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:01.133734 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:53:01.134022 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:01.133791 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 07:53:01.134022 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:01.133804 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls podName:b1a369df-257c-47a4-96da-3025f897b1dd nodeName:}" failed. No retries permitted until 2026-04-17 07:53:17.133785181 +0000 UTC m=+64.790648376 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls") pod "dns-default-6f2m8" (UID: "b1a369df-257c-47a4-96da-3025f897b1dd") : secret "dns-default-metrics-tls" not found
Apr 17 07:53:01.134022 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:01.133793 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:53:01.134022 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:01.133843 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert podName:7a34eada-e251-4bc7-8937-8f933c0cbd6f nodeName:}" failed. No retries permitted until 2026-04-17 07:53:17.133831004 +0000 UTC m=+64.790694200 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-529rs" (UID: "7a34eada-e251-4bc7-8937-8f933c0cbd6f") : secret "networking-console-plugin-cert" not found
Apr 17 07:53:01.134022 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:01.133794 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 07:53:01.134022 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:01.133857 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert podName:8ff81f11-f2e2-4838-a775-e57edc28571c nodeName:}" failed. No retries permitted until 2026-04-17 07:53:17.133850384 +0000 UTC m=+64.790713579 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert") pod "ingress-canary-9w6bd" (UID: "8ff81f11-f2e2-4838-a775-e57edc28571c") : secret "canary-serving-cert" not found
Apr 17 07:53:01.134022 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:01.133869 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bc6b6b8b5-xzvk9: secret "image-registry-tls" not found
Apr 17 07:53:01.134022 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:01.133928 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls podName:72a398ea-84ce-463a-aa82-659a22cc916a nodeName:}" failed. No retries permitted until 2026-04-17 07:53:17.13391434 +0000 UTC m=+64.790777545 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls") pod "image-registry-bc6b6b8b5-xzvk9" (UID: "72a398ea-84ce-463a-aa82-659a22cc916a") : secret "image-registry-tls" not found
Apr 17 07:53:12.111769 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:12.111742 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-h7zs8"
Apr 17 07:53:17.151788 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:17.151726 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9"
Apr 17 07:53:17.151788 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:17.151795 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-529rs\" (UID: \"7a34eada-e251-4bc7-8937-8f933c0cbd6f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs"
Apr 17 07:53:17.152345 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:17.151843 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8"
Apr 17 07:53:17.152345 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:17.151877 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert\") pod \"ingress-canary-9w6bd\" (UID: \"8ff81f11-f2e2-4838-a775-e57edc28571c\") " pod="openshift-ingress-canary/ingress-canary-9w6bd"
Apr 17 07:53:17.152345 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:17.151890 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 07:53:17.152345 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:17.151913 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bc6b6b8b5-xzvk9: secret "image-registry-tls" not found
Apr 17 07:53:17.152345 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:17.151961 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls podName:72a398ea-84ce-463a-aa82-659a22cc916a nodeName:}" failed. No retries permitted until 2026-04-17 07:53:49.151944854 +0000 UTC m=+96.808808045 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls") pod "image-registry-bc6b6b8b5-xzvk9" (UID: "72a398ea-84ce-463a-aa82-659a22cc916a") : secret "image-registry-tls" not found
Apr 17 07:53:17.152345 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:17.151969 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 07:53:17.152345 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:17.151985 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:53:17.152345 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:17.151999 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:53:17.152345 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:17.152042 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert podName:7a34eada-e251-4bc7-8937-8f933c0cbd6f nodeName:}" failed. No retries permitted until 2026-04-17 07:53:49.152024559 +0000 UTC m=+96.808887761 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-529rs" (UID: "7a34eada-e251-4bc7-8937-8f933c0cbd6f") : secret "networking-console-plugin-cert" not found
Apr 17 07:53:17.152345 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:17.152058 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls podName:b1a369df-257c-47a4-96da-3025f897b1dd nodeName:}" failed. No retries permitted until 2026-04-17 07:53:49.15205062 +0000 UTC m=+96.808913811 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls") pod "dns-default-6f2m8" (UID: "b1a369df-257c-47a4-96da-3025f897b1dd") : secret "dns-default-metrics-tls" not found
Apr 17 07:53:17.152345 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:17.152069 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert podName:8ff81f11-f2e2-4838-a775-e57edc28571c nodeName:}" failed. No retries permitted until 2026-04-17 07:53:49.152063192 +0000 UTC m=+96.808926384 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert") pod "ingress-canary-9w6bd" (UID: "8ff81f11-f2e2-4838-a775-e57edc28571c") : secret "canary-serving-cert" not found
Apr 17 07:53:18.563122 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:18.563086 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs\") pod \"network-metrics-daemon-wxvbl\" (UID: \"0cd46437-1e4d-4927-88fe-3d5f18ee621d\") " pod="openshift-multus/network-metrics-daemon-wxvbl"
Apr 17 07:53:18.563591 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:18.563230 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 07:53:18.563591 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:18.563309 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs podName:0cd46437-1e4d-4927-88fe-3d5f18ee621d nodeName:}" failed. No retries permitted until 2026-04-17 07:54:22.563294257 +0000 UTC m=+130.220157462 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs") pod "network-metrics-daemon-wxvbl" (UID: "0cd46437-1e4d-4927-88fe-3d5f18ee621d") : secret "metrics-daemon-secret" not found
Apr 17 07:53:18.765659 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:18.765614 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7f7p\" (UniqueName: \"kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p\") pod \"network-check-target-rw8ct\" (UID: \"b17b42fe-4930-48b1-ac74-5439d9fc893c\") " pod="openshift-network-diagnostics/network-check-target-rw8ct"
Apr 17 07:53:18.768611 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:18.768591 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 07:53:18.779215 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:18.779195 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 07:53:18.789644 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:18.789620 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7f7p\" (UniqueName: \"kubernetes.io/projected/b17b42fe-4930-48b1-ac74-5439d9fc893c-kube-api-access-f7f7p\") pod \"network-check-target-rw8ct\" (UID: \"b17b42fe-4930-48b1-ac74-5439d9fc893c\") " pod="openshift-network-diagnostics/network-check-target-rw8ct"
Apr 17 07:53:18.969786 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:18.969755 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-n45c2\""
Apr 17 07:53:18.977591 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:18.977571 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rw8ct"
Apr 17 07:53:19.110950 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:19.110922 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rw8ct"]
Apr 17 07:53:19.113795 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:53:19.113765 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb17b42fe_4930_48b1_ac74_5439d9fc893c.slice/crio-cc3f79585b77f0bf2309699e9d41a3289ea91a6ce9e6237532feb5c01a3c38e3 WatchSource:0}: Error finding container cc3f79585b77f0bf2309699e9d41a3289ea91a6ce9e6237532feb5c01a3c38e3: Status 404 returned error can't find the container with id cc3f79585b77f0bf2309699e9d41a3289ea91a6ce9e6237532feb5c01a3c38e3
Apr 17 07:53:19.183357 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:19.183318 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rw8ct" event={"ID":"b17b42fe-4930-48b1-ac74-5439d9fc893c","Type":"ContainerStarted","Data":"cc3f79585b77f0bf2309699e9d41a3289ea91a6ce9e6237532feb5c01a3c38e3"}
Apr 17 07:53:22.190236 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:22.190151 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rw8ct" event={"ID":"b17b42fe-4930-48b1-ac74-5439d9fc893c","Type":"ContainerStarted","Data":"9b94c5c5b3d3f16764636805fbbbb82fe623de73798703f5d550ab972cadd750"}
Apr 17 07:53:22.190609 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:22.190265 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-rw8ct"
Apr 17 07:53:22.204127 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:22.204077 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rw8ct" podStartSLOduration=66.39021864 podStartE2EDuration="1m9.204063154s" podCreationTimestamp="2026-04-17 07:52:13 +0000 UTC" firstStartedPulling="2026-04-17 07:53:19.115581145 +0000 UTC m=+66.772444336" lastFinishedPulling="2026-04-17 07:53:21.929425659 +0000 UTC m=+69.586288850" observedRunningTime="2026-04-17 07:53:22.203079049 +0000 UTC m=+69.859942273" watchObservedRunningTime="2026-04-17 07:53:22.204063154 +0000 UTC m=+69.860926361"
Apr 17 07:53:49.179732 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:49.179692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert\") pod \"ingress-canary-9w6bd\" (UID: \"8ff81f11-f2e2-4838-a775-e57edc28571c\") " pod="openshift-ingress-canary/ingress-canary-9w6bd"
Apr 17 07:53:49.180104 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:49.179772 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:53:49.180104 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:49.179784 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9"
Apr 17 07:53:49.180104 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:49.179816 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-529rs\" (UID: \"7a34eada-e251-4bc7-8937-8f933c0cbd6f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs"
Apr 17 07:53:49.180104 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:49.179827 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert podName:8ff81f11-f2e2-4838-a775-e57edc28571c nodeName:}" failed. No retries permitted until 2026-04-17 07:54:53.17981303 +0000 UTC m=+160.836676221 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert") pod "ingress-canary-9w6bd" (UID: "8ff81f11-f2e2-4838-a775-e57edc28571c") : secret "canary-serving-cert" not found
Apr 17 07:53:49.180104 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:49.179879 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 07:53:49.180104 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:49.179902 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 07:53:49.180104 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:49.179916 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert podName:7a34eada-e251-4bc7-8937-8f933c0cbd6f nodeName:}" failed. No retries permitted until 2026-04-17 07:54:53.179903747 +0000 UTC m=+160.836766939 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-529rs" (UID: "7a34eada-e251-4bc7-8937-8f933c0cbd6f") : secret "networking-console-plugin-cert" not found
Apr 17 07:53:49.180104 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:49.179917 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bc6b6b8b5-xzvk9: secret "image-registry-tls" not found
Apr 17 07:53:49.180104 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:49.179933 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8"
Apr 17 07:53:49.180104 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:49.179941 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls podName:72a398ea-84ce-463a-aa82-659a22cc916a nodeName:}" failed. No retries permitted until 2026-04-17 07:54:53.179934712 +0000 UTC m=+160.836797903 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls") pod "image-registry-bc6b6b8b5-xzvk9" (UID: "72a398ea-84ce-463a-aa82-659a22cc916a") : secret "image-registry-tls" not found
Apr 17 07:53:49.180104 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:49.179986 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:53:49.180104 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:53:49.180005 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls podName:b1a369df-257c-47a4-96da-3025f897b1dd nodeName:}" failed. No retries permitted until 2026-04-17 07:54:53.179999061 +0000 UTC m=+160.836862251 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls") pod "dns-default-6f2m8" (UID: "b1a369df-257c-47a4-96da-3025f897b1dd") : secret "dns-default-metrics-tls" not found
Apr 17 07:53:53.197968 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:53:53.197941 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rw8ct"
Apr 17 07:54:22.629631 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:22.629594 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs\") pod \"network-metrics-daemon-wxvbl\" (UID: \"0cd46437-1e4d-4927-88fe-3d5f18ee621d\") " pod="openshift-multus/network-metrics-daemon-wxvbl"
Apr 17 07:54:22.630112 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:22.629709 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 07:54:22.630112 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:22.629769 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs podName:0cd46437-1e4d-4927-88fe-3d5f18ee621d nodeName:}" failed. No retries permitted until 2026-04-17 07:56:24.629754146 +0000 UTC m=+252.286617336 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs") pod "network-metrics-daemon-wxvbl" (UID: "0cd46437-1e4d-4927-88fe-3d5f18ee621d") : secret "metrics-daemon-secret" not found
Apr 17 07:54:43.770133 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.770091 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fn8dn"]
Apr 17 07:54:43.772779 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.772763 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b"]
Apr 17 07:54:43.772922 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.772903 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fn8dn"
Apr 17 07:54:43.775141 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.775121 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-zrcx8"]
Apr 17 07:54:43.775338 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.775241 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b"
Apr 17 07:54:43.777356 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.777336 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 17 07:54:43.777539 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.777358 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 17 07:54:43.777642 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.777614 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-bvdcl"]
Apr 17 07:54:43.777757 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.777715 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8"
Apr 17 07:54:43.780641 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.779199 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:54:43.782524 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.782497 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-7vp6c\""
Apr 17 07:54:43.782776 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.782530 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 17 07:54:43.783952 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.783652 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f854e39-db34-4c83-8d33-c1d1898b7133-config\") pod \"console-operator-9d4b6777b-zrcx8\" (UID: \"5f854e39-db34-4c83-8d33-c1d1898b7133\") " pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8"
Apr 17 07:54:43.783952 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.783700 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef9177b8-0879-4607-8085-b87914bfa611-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-fn8dn\" (UID: \"ef9177b8-0879-4607-8085-b87914bfa611\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fn8dn"
Apr 17 07:54:43.783952 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.783704 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-89tkw\""
Apr 17 07:54:43.783952 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.783728 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28knr\" (UniqueName: \"kubernetes.io/projected/ef9177b8-0879-4607-8085-b87914bfa611-kube-api-access-28knr\") pod \"kube-storage-version-migrator-operator-6769c5d45-fn8dn\" (UID: \"ef9177b8-0879-4607-8085-b87914bfa611\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fn8dn"
Apr 17 07:54:43.783952 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.783746 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5dhd\" (UniqueName: \"kubernetes.io/projected/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-kube-api-access-w5dhd\") pod \"cluster-monitoring-operator-75587bd455-xt24b\" (UID: \"b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b"
Apr 17 07:54:43.783952 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.783773 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:54:43.783952 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.783842 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-xt24b\" (UID: \"b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b"
Apr 17 07:54:43.783952 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.783917 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f854e39-db34-4c83-8d33-c1d1898b7133-serving-cert\") pod \"console-operator-9d4b6777b-zrcx8\" (UID: \"5f854e39-db34-4c83-8d33-c1d1898b7133\") " pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8"
Apr 17 07:54:43.783952 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.783947 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xt24b\" (UID: \"b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b"
Apr 17 07:54:43.784469 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.783981 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f854e39-db34-4c83-8d33-c1d1898b7133-trusted-ca\") pod \"console-operator-9d4b6777b-zrcx8\" (UID: 
\"5f854e39-db34-4c83-8d33-c1d1898b7133\") " pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" Apr 17 07:54:43.784469 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.784016 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kddp\" (UniqueName: \"kubernetes.io/projected/5f854e39-db34-4c83-8d33-c1d1898b7133-kube-api-access-9kddp\") pod \"console-operator-9d4b6777b-zrcx8\" (UID: \"5f854e39-db34-4c83-8d33-c1d1898b7133\") " pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" Apr 17 07:54:43.784469 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.784050 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9177b8-0879-4607-8085-b87914bfa611-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-fn8dn\" (UID: \"ef9177b8-0879-4607-8085-b87914bfa611\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fn8dn" Apr 17 07:54:43.784469 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.784262 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 17 07:54:43.784469 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.784331 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 07:54:43.784469 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.784430 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-tz7sd\"" Apr 17 07:54:43.784756 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.784550 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fn8dn"] Apr 17 07:54:43.784756 
ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.784648 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-bvdcl" Apr 17 07:54:43.784756 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.784685 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 07:54:43.784756 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.784746 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 07:54:43.784944 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.784650 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 17 07:54:43.785290 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.784995 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 07:54:43.789163 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.789141 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 17 07:54:43.789267 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.789169 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 07:54:43.789267 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.789212 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 07:54:43.791206 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.791188 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 17 07:54:43.791691 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.791671 
2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 17 07:54:43.793402 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.793381 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-v7vhf\"" Apr 17 07:54:43.793846 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.793826 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 17 07:54:43.795797 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.795779 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b"] Apr 17 07:54:43.796054 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.796035 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 07:54:43.805388 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.805364 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-zrcx8"] Apr 17 07:54:43.811788 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.811766 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-bvdcl"] Apr 17 07:54:43.879560 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.879527 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4dn48"] Apr 17 07:54:43.882623 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.882600 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4dn48" Apr 17 07:54:43.884420 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.884400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f854e39-db34-4c83-8d33-c1d1898b7133-trusted-ca\") pod \"console-operator-9d4b6777b-zrcx8\" (UID: \"5f854e39-db34-4c83-8d33-c1d1898b7133\") " pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" Apr 17 07:54:43.884515 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.884432 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58646635-9ae5-4468-b026-e2e262f7810c-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-bvdcl\" (UID: \"58646635-9ae5-4468-b026-e2e262f7810c\") " pod="openshift-insights/insights-operator-585dfdc468-bvdcl" Apr 17 07:54:43.884515 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.884457 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9177b8-0879-4607-8085-b87914bfa611-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-fn8dn\" (UID: \"ef9177b8-0879-4607-8085-b87914bfa611\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fn8dn" Apr 17 07:54:43.884584 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.884564 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f854e39-db34-4c83-8d33-c1d1898b7133-serving-cert\") pod \"console-operator-9d4b6777b-zrcx8\" (UID: \"5f854e39-db34-4c83-8d33-c1d1898b7133\") " pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" Apr 17 07:54:43.884650 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.884629 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xt24b\" (UID: \"b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b" Apr 17 07:54:43.884708 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.884673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef9177b8-0879-4607-8085-b87914bfa611-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-fn8dn\" (UID: \"ef9177b8-0879-4607-8085-b87914bfa611\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fn8dn" Apr 17 07:54:43.884760 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.884708 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28knr\" (UniqueName: \"kubernetes.io/projected/ef9177b8-0879-4607-8085-b87914bfa611-kube-api-access-28knr\") pod \"kube-storage-version-migrator-operator-6769c5d45-fn8dn\" (UID: \"ef9177b8-0879-4607-8085-b87914bfa611\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fn8dn" Apr 17 07:54:43.884760 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.884739 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjmfk\" (UniqueName: \"kubernetes.io/projected/83abce3c-9745-4587-a4d0-fc4d481c1c19-kube-api-access-qjmfk\") pod \"service-ca-operator-d6fc45fc5-4dn48\" (UID: \"83abce3c-9745-4587-a4d0-fc4d481c1c19\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4dn48" Apr 17 07:54:43.884853 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.884771 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58646635-9ae5-4468-b026-e2e262f7810c-serving-cert\") pod \"insights-operator-585dfdc468-bvdcl\" (UID: \"58646635-9ae5-4468-b026-e2e262f7810c\") " pod="openshift-insights/insights-operator-585dfdc468-bvdcl" Apr 17 07:54:43.884853 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:43.884775 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 07:54:43.884853 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:43.884839 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls podName:b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be nodeName:}" failed. No retries permitted until 2026-04-17 07:54:44.384820683 +0000 UTC m=+152.041683889 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-xt24b" (UID: "b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be") : secret "cluster-monitoring-operator-tls" not found Apr 17 07:54:43.885001 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.884881 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-xt24b\" (UID: \"b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b" Apr 17 07:54:43.885001 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.884924 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: 
\"kubernetes.io/empty-dir/58646635-9ae5-4468-b026-e2e262f7810c-snapshots\") pod \"insights-operator-585dfdc468-bvdcl\" (UID: \"58646635-9ae5-4468-b026-e2e262f7810c\") " pod="openshift-insights/insights-operator-585dfdc468-bvdcl" Apr 17 07:54:43.885001 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.884955 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxmlk\" (UniqueName: \"kubernetes.io/projected/58646635-9ae5-4468-b026-e2e262f7810c-kube-api-access-sxmlk\") pod \"insights-operator-585dfdc468-bvdcl\" (UID: \"58646635-9ae5-4468-b026-e2e262f7810c\") " pod="openshift-insights/insights-operator-585dfdc468-bvdcl" Apr 17 07:54:43.885001 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.884995 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9177b8-0879-4607-8085-b87914bfa611-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-fn8dn\" (UID: \"ef9177b8-0879-4607-8085-b87914bfa611\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fn8dn" Apr 17 07:54:43.885196 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.885000 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58646635-9ae5-4468-b026-e2e262f7810c-service-ca-bundle\") pod \"insights-operator-585dfdc468-bvdcl\" (UID: \"58646635-9ae5-4468-b026-e2e262f7810c\") " pod="openshift-insights/insights-operator-585dfdc468-bvdcl" Apr 17 07:54:43.885196 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.885061 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kddp\" (UniqueName: \"kubernetes.io/projected/5f854e39-db34-4c83-8d33-c1d1898b7133-kube-api-access-9kddp\") pod \"console-operator-9d4b6777b-zrcx8\" (UID: \"5f854e39-db34-4c83-8d33-c1d1898b7133\") 
" pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" Apr 17 07:54:43.885196 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.885133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5dhd\" (UniqueName: \"kubernetes.io/projected/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-kube-api-access-w5dhd\") pod \"cluster-monitoring-operator-75587bd455-xt24b\" (UID: \"b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b" Apr 17 07:54:43.885196 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.885170 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f854e39-db34-4c83-8d33-c1d1898b7133-config\") pod \"console-operator-9d4b6777b-zrcx8\" (UID: \"5f854e39-db34-4c83-8d33-c1d1898b7133\") " pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" Apr 17 07:54:43.885421 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.885205 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83abce3c-9745-4587-a4d0-fc4d481c1c19-serving-cert\") pod \"service-ca-operator-d6fc45fc5-4dn48\" (UID: \"83abce3c-9745-4587-a4d0-fc4d481c1c19\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4dn48" Apr 17 07:54:43.885421 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.885246 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/58646635-9ae5-4468-b026-e2e262f7810c-tmp\") pod \"insights-operator-585dfdc468-bvdcl\" (UID: \"58646635-9ae5-4468-b026-e2e262f7810c\") " pod="openshift-insights/insights-operator-585dfdc468-bvdcl" Apr 17 07:54:43.885421 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.885295 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83abce3c-9745-4587-a4d0-fc4d481c1c19-config\") pod \"service-ca-operator-d6fc45fc5-4dn48\" (UID: \"83abce3c-9745-4587-a4d0-fc4d481c1c19\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4dn48" Apr 17 07:54:43.885565 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.885492 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f854e39-db34-4c83-8d33-c1d1898b7133-trusted-ca\") pod \"console-operator-9d4b6777b-zrcx8\" (UID: \"5f854e39-db34-4c83-8d33-c1d1898b7133\") " pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" Apr 17 07:54:43.885739 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.885719 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f854e39-db34-4c83-8d33-c1d1898b7133-config\") pod \"console-operator-9d4b6777b-zrcx8\" (UID: \"5f854e39-db34-4c83-8d33-c1d1898b7133\") " pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" Apr 17 07:54:43.886031 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.886007 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-xt24b\" (UID: \"b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b" Apr 17 07:54:43.887129 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.887109 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f854e39-db34-4c83-8d33-c1d1898b7133-serving-cert\") pod \"console-operator-9d4b6777b-zrcx8\" (UID: \"5f854e39-db34-4c83-8d33-c1d1898b7133\") " pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" Apr 17 07:54:43.887259 
ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.887244 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef9177b8-0879-4607-8085-b87914bfa611-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-fn8dn\" (UID: \"ef9177b8-0879-4607-8085-b87914bfa611\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fn8dn" Apr 17 07:54:43.889453 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.889421 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 07:54:43.889547 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.889470 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 07:54:43.889547 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.889509 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 07:54:43.889547 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.889531 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-s8r9r\"" Apr 17 07:54:43.889547 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.889537 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 07:54:43.902073 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.902046 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4dn48"] Apr 17 07:54:43.918745 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.918721 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5dhd\" (UniqueName: 
\"kubernetes.io/projected/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-kube-api-access-w5dhd\") pod \"cluster-monitoring-operator-75587bd455-xt24b\" (UID: \"b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b" Apr 17 07:54:43.919239 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.919221 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28knr\" (UniqueName: \"kubernetes.io/projected/ef9177b8-0879-4607-8085-b87914bfa611-kube-api-access-28knr\") pod \"kube-storage-version-migrator-operator-6769c5d45-fn8dn\" (UID: \"ef9177b8-0879-4607-8085-b87914bfa611\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fn8dn" Apr 17 07:54:43.923187 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.923165 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kddp\" (UniqueName: \"kubernetes.io/projected/5f854e39-db34-4c83-8d33-c1d1898b7133-kube-api-access-9kddp\") pod \"console-operator-9d4b6777b-zrcx8\" (UID: \"5f854e39-db34-4c83-8d33-c1d1898b7133\") " pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" Apr 17 07:54:43.966700 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.966666 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7bcb9d645d-ktdgl"] Apr 17 07:54:43.970766 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.970748 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7bcb9d645d-ktdgl" Apr 17 07:54:43.975627 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.975437 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 07:54:43.975627 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.975581 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 07:54:43.975822 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.975808 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 07:54:43.975909 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.975882 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 07:54:43.976037 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.975999 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-b9kgr\"" Apr 17 07:54:43.976037 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.975810 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 07:54:43.976152 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.976134 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 07:54:43.983754 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.983732 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7bcb9d645d-ktdgl"] Apr 17 07:54:43.986368 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.986342 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/83abce3c-9745-4587-a4d0-fc4d481c1c19-serving-cert\") pod \"service-ca-operator-d6fc45fc5-4dn48\" (UID: \"83abce3c-9745-4587-a4d0-fc4d481c1c19\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4dn48" Apr 17 07:54:43.986485 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.986373 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-default-certificate\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl" Apr 17 07:54:43.986485 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.986401 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/58646635-9ae5-4468-b026-e2e262f7810c-tmp\") pod \"insights-operator-585dfdc468-bvdcl\" (UID: \"58646635-9ae5-4468-b026-e2e262f7810c\") " pod="openshift-insights/insights-operator-585dfdc468-bvdcl" Apr 17 07:54:43.986485 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.986426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83abce3c-9745-4587-a4d0-fc4d481c1c19-config\") pod \"service-ca-operator-d6fc45fc5-4dn48\" (UID: \"83abce3c-9745-4587-a4d0-fc4d481c1c19\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4dn48" Apr 17 07:54:43.986485 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.986464 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mqkd\" (UniqueName: \"kubernetes.io/projected/999368dd-92dc-4226-a15b-73baf3d1e08a-kube-api-access-6mqkd\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl" Apr 17 
07:54:43.986686 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.986496 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58646635-9ae5-4468-b026-e2e262f7810c-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-bvdcl\" (UID: \"58646635-9ae5-4468-b026-e2e262f7810c\") " pod="openshift-insights/insights-operator-585dfdc468-bvdcl"
Apr 17 07:54:43.986686 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.986545 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjmfk\" (UniqueName: \"kubernetes.io/projected/83abce3c-9745-4587-a4d0-fc4d481c1c19-kube-api-access-qjmfk\") pod \"service-ca-operator-d6fc45fc5-4dn48\" (UID: \"83abce3c-9745-4587-a4d0-fc4d481c1c19\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4dn48"
Apr 17 07:54:43.986686 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.986572 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58646635-9ae5-4468-b026-e2e262f7810c-serving-cert\") pod \"insights-operator-585dfdc468-bvdcl\" (UID: \"58646635-9ae5-4468-b026-e2e262f7810c\") " pod="openshift-insights/insights-operator-585dfdc468-bvdcl"
Apr 17 07:54:43.986686 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.986622 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/58646635-9ae5-4468-b026-e2e262f7810c-snapshots\") pod \"insights-operator-585dfdc468-bvdcl\" (UID: \"58646635-9ae5-4468-b026-e2e262f7810c\") " pod="openshift-insights/insights-operator-585dfdc468-bvdcl"
Apr 17 07:54:43.986686 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.986645 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-metrics-certs\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl"
Apr 17 07:54:43.986686 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.986669 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxmlk\" (UniqueName: \"kubernetes.io/projected/58646635-9ae5-4468-b026-e2e262f7810c-kube-api-access-sxmlk\") pod \"insights-operator-585dfdc468-bvdcl\" (UID: \"58646635-9ae5-4468-b026-e2e262f7810c\") " pod="openshift-insights/insights-operator-585dfdc468-bvdcl"
Apr 17 07:54:43.986967 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.986691 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-stats-auth\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl"
Apr 17 07:54:43.986967 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.986725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl"
Apr 17 07:54:43.986967 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.986752 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58646635-9ae5-4468-b026-e2e262f7810c-service-ca-bundle\") pod \"insights-operator-585dfdc468-bvdcl\" (UID: \"58646635-9ae5-4468-b026-e2e262f7810c\") " pod="openshift-insights/insights-operator-585dfdc468-bvdcl"
Apr 17 07:54:43.987129 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.987041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83abce3c-9745-4587-a4d0-fc4d481c1c19-config\") pod \"service-ca-operator-d6fc45fc5-4dn48\" (UID: \"83abce3c-9745-4587-a4d0-fc4d481c1c19\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4dn48"
Apr 17 07:54:43.987181 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.987146 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/58646635-9ae5-4468-b026-e2e262f7810c-tmp\") pod \"insights-operator-585dfdc468-bvdcl\" (UID: \"58646635-9ae5-4468-b026-e2e262f7810c\") " pod="openshift-insights/insights-operator-585dfdc468-bvdcl"
Apr 17 07:54:43.987428 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.987403 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58646635-9ae5-4468-b026-e2e262f7810c-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-bvdcl\" (UID: \"58646635-9ae5-4468-b026-e2e262f7810c\") " pod="openshift-insights/insights-operator-585dfdc468-bvdcl"
Apr 17 07:54:43.987523 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.987457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/58646635-9ae5-4468-b026-e2e262f7810c-snapshots\") pod \"insights-operator-585dfdc468-bvdcl\" (UID: \"58646635-9ae5-4468-b026-e2e262f7810c\") " pod="openshift-insights/insights-operator-585dfdc468-bvdcl"
Apr 17 07:54:43.987523 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.987469 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58646635-9ae5-4468-b026-e2e262f7810c-service-ca-bundle\") pod \"insights-operator-585dfdc468-bvdcl\" (UID: \"58646635-9ae5-4468-b026-e2e262f7810c\") " pod="openshift-insights/insights-operator-585dfdc468-bvdcl"
Apr 17 07:54:43.989060 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.989018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83abce3c-9745-4587-a4d0-fc4d481c1c19-serving-cert\") pod \"service-ca-operator-d6fc45fc5-4dn48\" (UID: \"83abce3c-9745-4587-a4d0-fc4d481c1c19\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4dn48"
Apr 17 07:54:43.989226 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.989200 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58646635-9ae5-4468-b026-e2e262f7810c-serving-cert\") pod \"insights-operator-585dfdc468-bvdcl\" (UID: \"58646635-9ae5-4468-b026-e2e262f7810c\") " pod="openshift-insights/insights-operator-585dfdc468-bvdcl"
Apr 17 07:54:43.995772 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.995745 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjmfk\" (UniqueName: \"kubernetes.io/projected/83abce3c-9745-4587-a4d0-fc4d481c1c19-kube-api-access-qjmfk\") pod \"service-ca-operator-d6fc45fc5-4dn48\" (UID: \"83abce3c-9745-4587-a4d0-fc4d481c1c19\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4dn48"
Apr 17 07:54:43.995897 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:43.995800 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxmlk\" (UniqueName: \"kubernetes.io/projected/58646635-9ae5-4468-b026-e2e262f7810c-kube-api-access-sxmlk\") pod \"insights-operator-585dfdc468-bvdcl\" (UID: \"58646635-9ae5-4468-b026-e2e262f7810c\") " pod="openshift-insights/insights-operator-585dfdc468-bvdcl"
Apr 17 07:54:44.085819 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.085713 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fn8dn"
Apr 17 07:54:44.087721 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.087692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mqkd\" (UniqueName: \"kubernetes.io/projected/999368dd-92dc-4226-a15b-73baf3d1e08a-kube-api-access-6mqkd\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl"
Apr 17 07:54:44.087850 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.087801 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-metrics-certs\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl"
Apr 17 07:54:44.087850 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.087834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-stats-auth\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl"
Apr 17 07:54:44.087953 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.087869 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl"
Apr 17 07:54:44.087953 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:44.087925 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 07:54:44.088056 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:44.088002 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-metrics-certs podName:999368dd-92dc-4226-a15b-73baf3d1e08a nodeName:}" failed. No retries permitted until 2026-04-17 07:54:44.587982614 +0000 UTC m=+152.244845809 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-metrics-certs") pod "router-default-7bcb9d645d-ktdgl" (UID: "999368dd-92dc-4226-a15b-73baf3d1e08a") : secret "router-metrics-certs-default" not found
Apr 17 07:54:44.088056 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.087929 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-default-certificate\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl"
Apr 17 07:54:44.088056 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:44.088050 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle podName:999368dd-92dc-4226-a15b-73baf3d1e08a nodeName:}" failed. No retries permitted until 2026-04-17 07:54:44.588038235 +0000 UTC m=+152.244901441 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle") pod "router-default-7bcb9d645d-ktdgl" (UID: "999368dd-92dc-4226-a15b-73baf3d1e08a") : configmap references non-existent config key: service-ca.crt
Apr 17 07:54:44.090342 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.090323 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-default-certificate\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl"
Apr 17 07:54:44.090442 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.090363 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-stats-auth\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl"
Apr 17 07:54:44.100077 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.100050 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mqkd\" (UniqueName: \"kubernetes.io/projected/999368dd-92dc-4226-a15b-73baf3d1e08a-kube-api-access-6mqkd\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl"
Apr 17 07:54:44.100810 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.100794 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8"
Apr 17 07:54:44.105966 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.105946 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-bvdcl"
Apr 17 07:54:44.200916 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.200883 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4dn48"
Apr 17 07:54:44.220379 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.220353 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fn8dn"]
Apr 17 07:54:44.223144 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:54:44.223115 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef9177b8_0879_4607_8085_b87914bfa611.slice/crio-3c52d3d48b38d87740409dd063bb5308875671db20bff2caaea3ce690db84cbc WatchSource:0}: Error finding container 3c52d3d48b38d87740409dd063bb5308875671db20bff2caaea3ce690db84cbc: Status 404 returned error can't find the container with id 3c52d3d48b38d87740409dd063bb5308875671db20bff2caaea3ce690db84cbc
Apr 17 07:54:44.244181 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.243152 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-zrcx8"]
Apr 17 07:54:44.246442 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:54:44.246397 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f854e39_db34_4c83_8d33_c1d1898b7133.slice/crio-93f11108fdeea7928f73a7558034abcc5364a866bf2b923761426913d980003b WatchSource:0}: Error finding container 93f11108fdeea7928f73a7558034abcc5364a866bf2b923761426913d980003b: Status 404 returned error can't find the container with id 93f11108fdeea7928f73a7558034abcc5364a866bf2b923761426913d980003b
Apr 17 07:54:44.256493 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.256468 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-bvdcl"]
Apr 17 07:54:44.259927 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:54:44.259897 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58646635_9ae5_4468_b026_e2e262f7810c.slice/crio-219a690ad2975023fddfe908916f08282484a0f60de4eb6d0196c8446ca3772a WatchSource:0}: Error finding container 219a690ad2975023fddfe908916f08282484a0f60de4eb6d0196c8446ca3772a: Status 404 returned error can't find the container with id 219a690ad2975023fddfe908916f08282484a0f60de4eb6d0196c8446ca3772a
Apr 17 07:54:44.326125 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.326093 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4dn48"]
Apr 17 07:54:44.328956 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:54:44.328924 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83abce3c_9745_4587_a4d0_fc4d481c1c19.slice/crio-1f2995de53148f57b8245173fb290492e760f5b4abdf42298fdbe259cd88bb42 WatchSource:0}: Error finding container 1f2995de53148f57b8245173fb290492e760f5b4abdf42298fdbe259cd88bb42: Status 404 returned error can't find the container with id 1f2995de53148f57b8245173fb290492e760f5b4abdf42298fdbe259cd88bb42
Apr 17 07:54:44.349119 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.349083 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-bvdcl" event={"ID":"58646635-9ae5-4468-b026-e2e262f7810c","Type":"ContainerStarted","Data":"219a690ad2975023fddfe908916f08282484a0f60de4eb6d0196c8446ca3772a"}
Apr 17 07:54:44.349939 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.349917 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" event={"ID":"5f854e39-db34-4c83-8d33-c1d1898b7133","Type":"ContainerStarted","Data":"93f11108fdeea7928f73a7558034abcc5364a866bf2b923761426913d980003b"}
Apr 17 07:54:44.350756 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.350735 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fn8dn" event={"ID":"ef9177b8-0879-4607-8085-b87914bfa611","Type":"ContainerStarted","Data":"3c52d3d48b38d87740409dd063bb5308875671db20bff2caaea3ce690db84cbc"}
Apr 17 07:54:44.351478 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.351462 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4dn48" event={"ID":"83abce3c-9745-4587-a4d0-fc4d481c1c19","Type":"ContainerStarted","Data":"1f2995de53148f57b8245173fb290492e760f5b4abdf42298fdbe259cd88bb42"}
Apr 17 07:54:44.390614 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.390576 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xt24b\" (UID: \"b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b"
Apr 17 07:54:44.390752 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:44.390697 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 07:54:44.390752 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:44.390751 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls podName:b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be nodeName:}" failed. No retries permitted until 2026-04-17 07:54:45.390734249 +0000 UTC m=+153.047597440 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-xt24b" (UID: "b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be") : secret "cluster-monitoring-operator-tls" not found
Apr 17 07:54:44.591881 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.591836 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-metrics-certs\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl"
Apr 17 07:54:44.591881 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:44.591886 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl"
Apr 17 07:54:44.592130 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:44.592000 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 07:54:44.592130 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:44.592073 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-metrics-certs podName:999368dd-92dc-4226-a15b-73baf3d1e08a nodeName:}" failed. No retries permitted until 2026-04-17 07:54:45.592057576 +0000 UTC m=+153.248920770 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-metrics-certs") pod "router-default-7bcb9d645d-ktdgl" (UID: "999368dd-92dc-4226-a15b-73baf3d1e08a") : secret "router-metrics-certs-default" not found
Apr 17 07:54:44.592130 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:44.592089 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle podName:999368dd-92dc-4226-a15b-73baf3d1e08a nodeName:}" failed. No retries permitted until 2026-04-17 07:54:45.59208249 +0000 UTC m=+153.248945685 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle") pod "router-default-7bcb9d645d-ktdgl" (UID: "999368dd-92dc-4226-a15b-73baf3d1e08a") : configmap references non-existent config key: service-ca.crt
Apr 17 07:54:45.400722 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:45.400609 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xt24b\" (UID: \"b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b"
Apr 17 07:54:45.401189 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:45.400780 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 07:54:45.401189 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:45.400861 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls podName:b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be nodeName:}" failed. No retries permitted until 2026-04-17 07:54:47.400837856 +0000 UTC m=+155.057701050 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-xt24b" (UID: "b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be") : secret "cluster-monitoring-operator-tls" not found
Apr 17 07:54:45.603396 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:45.602865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-metrics-certs\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl"
Apr 17 07:54:45.603396 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:45.602926 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl"
Apr 17 07:54:45.603396 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:45.603024 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 07:54:45.603396 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:45.603080 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle podName:999368dd-92dc-4226-a15b-73baf3d1e08a nodeName:}" failed. No retries permitted until 2026-04-17 07:54:47.603060573 +0000 UTC m=+155.259923768 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle") pod "router-default-7bcb9d645d-ktdgl" (UID: "999368dd-92dc-4226-a15b-73baf3d1e08a") : configmap references non-existent config key: service-ca.crt
Apr 17 07:54:45.603396 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:45.603103 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-metrics-certs podName:999368dd-92dc-4226-a15b-73baf3d1e08a nodeName:}" failed. No retries permitted until 2026-04-17 07:54:47.603093287 +0000 UTC m=+155.259956482 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-metrics-certs") pod "router-default-7bcb9d645d-ktdgl" (UID: "999368dd-92dc-4226-a15b-73baf3d1e08a") : secret "router-metrics-certs-default" not found
Apr 17 07:54:47.420850 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:47.420807 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xt24b\" (UID: \"b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b"
Apr 17 07:54:47.421302 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:47.420949 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 07:54:47.421302 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:47.421033 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls podName:b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be nodeName:}" failed. No retries permitted until 2026-04-17 07:54:51.421009722 +0000 UTC m=+159.077872920 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-xt24b" (UID: "b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be") : secret "cluster-monitoring-operator-tls" not found
Apr 17 07:54:47.622309 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:47.622259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-metrics-certs\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl"
Apr 17 07:54:47.622467 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:47.622327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl"
Apr 17 07:54:47.622467 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:47.622387 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 07:54:47.622467 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:47.622454 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-metrics-certs podName:999368dd-92dc-4226-a15b-73baf3d1e08a nodeName:}" failed. No retries permitted until 2026-04-17 07:54:51.622437877 +0000 UTC m=+159.279301067 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-metrics-certs") pod "router-default-7bcb9d645d-ktdgl" (UID: "999368dd-92dc-4226-a15b-73baf3d1e08a") : secret "router-metrics-certs-default" not found
Apr 17 07:54:47.622467 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:47.622467 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle podName:999368dd-92dc-4226-a15b-73baf3d1e08a nodeName:}" failed. No retries permitted until 2026-04-17 07:54:51.622461461 +0000 UTC m=+159.279324652 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle") pod "router-default-7bcb9d645d-ktdgl" (UID: "999368dd-92dc-4226-a15b-73baf3d1e08a") : configmap references non-existent config key: service-ca.crt
Apr 17 07:54:48.256123 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:48.256079 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs" podUID="7a34eada-e251-4bc7-8937-8f933c0cbd6f"
Apr 17 07:54:48.270372 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:48.270332 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" podUID="72a398ea-84ce-463a-aa82-659a22cc916a"
Apr 17 07:54:48.288586 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:48.288556 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6f2m8" podUID="b1a369df-257c-47a4-96da-3025f897b1dd"
Apr 17 07:54:48.304850 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:48.304822 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-9w6bd" podUID="8ff81f11-f2e2-4838-a775-e57edc28571c"
Apr 17 07:54:48.363947 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:48.363908 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4dn48" event={"ID":"83abce3c-9745-4587-a4d0-fc4d481c1c19","Type":"ContainerStarted","Data":"765aac79c956db99e91f7f160f22507b753496c935785a0c3e0712b77042fd4c"}
Apr 17 07:54:48.365442 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:48.365410 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-bvdcl" event={"ID":"58646635-9ae5-4468-b026-e2e262f7810c","Type":"ContainerStarted","Data":"2ff00bd8fa5b5224ce52482d8eb6fc5fc28be92484dcd15b647544248bb53751"}
Apr 17 07:54:48.366868 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:48.366849 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/0.log"
Apr 17 07:54:48.367086 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:48.366887 2576 generic.go:358] "Generic (PLEG): container finished" podID="5f854e39-db34-4c83-8d33-c1d1898b7133" containerID="92ccd57db96e7611b9ab842169228b6a14a61848f8ffba97bd2485e80f862369" exitCode=255
Apr 17 07:54:48.367086 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:48.366956 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" event={"ID":"5f854e39-db34-4c83-8d33-c1d1898b7133","Type":"ContainerDied","Data":"92ccd57db96e7611b9ab842169228b6a14a61848f8ffba97bd2485e80f862369"}
Apr 17 07:54:48.367245 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:48.367229 2576 scope.go:117] "RemoveContainer" containerID="92ccd57db96e7611b9ab842169228b6a14a61848f8ffba97bd2485e80f862369"
Apr 17 07:54:48.368475 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:48.368398 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs"
Apr 17 07:54:48.368475 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:48.368400 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9w6bd"
Apr 17 07:54:48.368475 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:48.368432 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fn8dn" event={"ID":"ef9177b8-0879-4607-8085-b87914bfa611","Type":"ContainerStarted","Data":"e006ad9614795805c694ed4c68f559a1a618e45d71cf5bc3c5727ce357bd0eb0"}
Apr 17 07:54:48.368687 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:48.368487 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9"
Apr 17 07:54:48.368783 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:48.368767 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6f2m8"
Apr 17 07:54:48.380822 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:48.380781 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4dn48" podStartSLOduration=2.029029246 podStartE2EDuration="5.380767187s" podCreationTimestamp="2026-04-17 07:54:43 +0000 UTC" firstStartedPulling="2026-04-17 07:54:44.330640488 +0000 UTC m=+151.987503679" lastFinishedPulling="2026-04-17 07:54:47.682378415 +0000 UTC m=+155.339241620" observedRunningTime="2026-04-17 07:54:48.379791852 +0000 UTC m=+156.036655069" watchObservedRunningTime="2026-04-17 07:54:48.380767187 +0000 UTC m=+156.037630405"
Apr 17 07:54:48.396410 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:48.396363 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fn8dn" podStartSLOduration=1.942852714 podStartE2EDuration="5.396346118s" podCreationTimestamp="2026-04-17 07:54:43 +0000 UTC" firstStartedPulling="2026-04-17 07:54:44.225587238 +0000 UTC m=+151.882450433" lastFinishedPulling="2026-04-17 07:54:47.67908064 +0000 UTC m=+155.335943837" observedRunningTime="2026-04-17 07:54:48.396115848 +0000 UTC m=+156.052979061" watchObservedRunningTime="2026-04-17 07:54:48.396346118 +0000 UTC m=+156.053209334"
Apr 17 07:54:48.439781 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:48.439735 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-bvdcl" podStartSLOduration=2.018427862 podStartE2EDuration="5.439719623s" podCreationTimestamp="2026-04-17 07:54:43 +0000 UTC" firstStartedPulling="2026-04-17 07:54:44.262094554 +0000 UTC m=+151.918957749" lastFinishedPulling="2026-04-17 07:54:47.683386319 +0000 UTC m=+155.340249510" observedRunningTime="2026-04-17 07:54:48.438657489 +0000 UTC m=+156.095520706" watchObservedRunningTime="2026-04-17 07:54:48.439719623 +0000 UTC m=+156.096582904"
Apr 17 07:54:48.874418 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:48.874383 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-wxvbl" podUID="0cd46437-1e4d-4927-88fe-3d5f18ee621d"
Apr 17 07:54:49.371963 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:49.371936 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/1.log"
Apr 17 07:54:49.372263 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:49.372251 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/0.log"
Apr 17 07:54:49.372351 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:49.372303 2576 generic.go:358] "Generic (PLEG): container finished" podID="5f854e39-db34-4c83-8d33-c1d1898b7133" containerID="8a6455c8d5065bd8e373516f6f6a04601ea1bc442ae2ab8eef18a5f2b3c2c496" exitCode=255
Apr 17 07:54:49.372417 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:49.372396 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" event={"ID":"5f854e39-db34-4c83-8d33-c1d1898b7133","Type":"ContainerDied","Data":"8a6455c8d5065bd8e373516f6f6a04601ea1bc442ae2ab8eef18a5f2b3c2c496"}
Apr 17 07:54:49.372452 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:49.372443 2576 scope.go:117] "RemoveContainer" containerID="92ccd57db96e7611b9ab842169228b6a14a61848f8ffba97bd2485e80f862369"
Apr 17 07:54:49.373112 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:49.373067 2576 scope.go:117] "RemoveContainer" containerID="8a6455c8d5065bd8e373516f6f6a04601ea1bc442ae2ab8eef18a5f2b3c2c496"
Apr 17 07:54:49.376386 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:49.373430 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-zrcx8_openshift-console-operator(5f854e39-db34-4c83-8d33-c1d1898b7133)\"" pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" podUID="5f854e39-db34-4c83-8d33-c1d1898b7133"
Apr 17 07:54:50.376406 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:50.376382 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/1.log"
Apr 17 07:54:50.376799 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:50.376748 2576 scope.go:117] "RemoveContainer" containerID="8a6455c8d5065bd8e373516f6f6a04601ea1bc442ae2ab8eef18a5f2b3c2c496"
Apr 17 07:54:50.376937 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:50.376920 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-zrcx8_openshift-console-operator(5f854e39-db34-4c83-8d33-c1d1898b7133)\"" pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" podUID="5f854e39-db34-4c83-8d33-c1d1898b7133"
Apr 17 07:54:50.708754 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:50.708678 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-xvmfc_5cf29621-68bf-43a5-94a8-643b390fca92/dns-node-resolver/0.log"
Apr 17 07:54:51.457136 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.457096 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xt24b\" (UID: \"b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b" Apr 17 07:54:51.457551 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:51.457244 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 07:54:51.457551 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:51.457342 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls podName:b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be nodeName:}" failed. No retries permitted until 2026-04-17 07:54:59.457326111 +0000 UTC m=+167.114189301 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-xt24b" (UID: "b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be") : secret "cluster-monitoring-operator-tls" not found Apr 17 07:54:51.586829 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.586798 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-l6ghs"] Apr 17 07:54:51.590905 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.590885 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-l6ghs" Apr 17 07:54:51.593414 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.593397 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 07:54:51.593797 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.593781 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 07:54:51.594390 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.594377 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 07:54:51.594466 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.594453 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 07:54:51.594514 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.594459 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-s26nx\"" Apr 17 07:54:51.599640 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.599619 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-l6ghs"] Apr 17 07:54:51.659633 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.659605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-metrics-certs\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl" Apr 17 07:54:51.659739 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.659639 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl" Apr 17 07:54:51.659739 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.659691 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7tnc\" (UniqueName: \"kubernetes.io/projected/dfa031bf-d993-4f40-8c00-87d6e5f68067-kube-api-access-k7tnc\") pod \"service-ca-865cb79987-l6ghs\" (UID: \"dfa031bf-d993-4f40-8c00-87d6e5f68067\") " pod="openshift-service-ca/service-ca-865cb79987-l6ghs" Apr 17 07:54:51.659739 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.659710 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/dfa031bf-d993-4f40-8c00-87d6e5f68067-signing-key\") pod \"service-ca-865cb79987-l6ghs\" (UID: \"dfa031bf-d993-4f40-8c00-87d6e5f68067\") " pod="openshift-service-ca/service-ca-865cb79987-l6ghs" Apr 17 07:54:51.659739 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.659725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/dfa031bf-d993-4f40-8c00-87d6e5f68067-signing-cabundle\") pod \"service-ca-865cb79987-l6ghs\" (UID: \"dfa031bf-d993-4f40-8c00-87d6e5f68067\") " pod="openshift-service-ca/service-ca-865cb79987-l6ghs" Apr 17 07:54:51.659867 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:51.659743 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 07:54:51.659867 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:51.659815 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-metrics-certs 
podName:999368dd-92dc-4226-a15b-73baf3d1e08a nodeName:}" failed. No retries permitted until 2026-04-17 07:54:59.659798241 +0000 UTC m=+167.316661439 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-metrics-certs") pod "router-default-7bcb9d645d-ktdgl" (UID: "999368dd-92dc-4226-a15b-73baf3d1e08a") : secret "router-metrics-certs-default" not found Apr 17 07:54:51.659867 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:51.659843 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle podName:999368dd-92dc-4226-a15b-73baf3d1e08a nodeName:}" failed. No retries permitted until 2026-04-17 07:54:59.659831759 +0000 UTC m=+167.316694953 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle") pod "router-default-7bcb9d645d-ktdgl" (UID: "999368dd-92dc-4226-a15b-73baf3d1e08a") : configmap references non-existent config key: service-ca.crt Apr 17 07:54:51.760177 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.760113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7tnc\" (UniqueName: \"kubernetes.io/projected/dfa031bf-d993-4f40-8c00-87d6e5f68067-kube-api-access-k7tnc\") pod \"service-ca-865cb79987-l6ghs\" (UID: \"dfa031bf-d993-4f40-8c00-87d6e5f68067\") " pod="openshift-service-ca/service-ca-865cb79987-l6ghs" Apr 17 07:54:51.760177 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.760146 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/dfa031bf-d993-4f40-8c00-87d6e5f68067-signing-key\") pod \"service-ca-865cb79987-l6ghs\" (UID: \"dfa031bf-d993-4f40-8c00-87d6e5f68067\") " pod="openshift-service-ca/service-ca-865cb79987-l6ghs" 
Apr 17 07:54:51.760177 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.760168 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/dfa031bf-d993-4f40-8c00-87d6e5f68067-signing-cabundle\") pod \"service-ca-865cb79987-l6ghs\" (UID: \"dfa031bf-d993-4f40-8c00-87d6e5f68067\") " pod="openshift-service-ca/service-ca-865cb79987-l6ghs" Apr 17 07:54:51.760844 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.760813 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/dfa031bf-d993-4f40-8c00-87d6e5f68067-signing-cabundle\") pod \"service-ca-865cb79987-l6ghs\" (UID: \"dfa031bf-d993-4f40-8c00-87d6e5f68067\") " pod="openshift-service-ca/service-ca-865cb79987-l6ghs" Apr 17 07:54:51.762444 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.762427 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/dfa031bf-d993-4f40-8c00-87d6e5f68067-signing-key\") pod \"service-ca-865cb79987-l6ghs\" (UID: \"dfa031bf-d993-4f40-8c00-87d6e5f68067\") " pod="openshift-service-ca/service-ca-865cb79987-l6ghs" Apr 17 07:54:51.769337 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.769302 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7tnc\" (UniqueName: \"kubernetes.io/projected/dfa031bf-d993-4f40-8c00-87d6e5f68067-kube-api-access-k7tnc\") pod \"service-ca-865cb79987-l6ghs\" (UID: \"dfa031bf-d993-4f40-8c00-87d6e5f68067\") " pod="openshift-service-ca/service-ca-865cb79987-l6ghs" Apr 17 07:54:51.899936 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.899907 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-l6ghs" Apr 17 07:54:51.907825 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:51.907805 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ssttk_5a79228b-b4cc-4d96-b9a2-a587214f9a0d/node-ca/0.log" Apr 17 07:54:52.014551 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:52.014474 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-l6ghs"] Apr 17 07:54:52.017813 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:54:52.017784 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfa031bf_d993_4f40_8c00_87d6e5f68067.slice/crio-28ee3c4580e948b5e517bacae50d0bcb2edd86c789aaf94cba1a14a0ba1d632c WatchSource:0}: Error finding container 28ee3c4580e948b5e517bacae50d0bcb2edd86c789aaf94cba1a14a0ba1d632c: Status 404 returned error can't find the container with id 28ee3c4580e948b5e517bacae50d0bcb2edd86c789aaf94cba1a14a0ba1d632c Apr 17 07:54:52.387578 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:52.387544 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-l6ghs" event={"ID":"dfa031bf-d993-4f40-8c00-87d6e5f68067","Type":"ContainerStarted","Data":"28f030f1a9542a9e67979c1a25da5264fac1f5bff0770774aa6e7173caab6885"} Apr 17 07:54:52.387578 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:52.387579 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-l6ghs" event={"ID":"dfa031bf-d993-4f40-8c00-87d6e5f68067","Type":"ContainerStarted","Data":"28ee3c4580e948b5e517bacae50d0bcb2edd86c789aaf94cba1a14a0ba1d632c"} Apr 17 07:54:52.406163 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:52.406118 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-l6ghs" podStartSLOduration=1.40610487 
podStartE2EDuration="1.40610487s" podCreationTimestamp="2026-04-17 07:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:54:52.404402331 +0000 UTC m=+160.061265556" watchObservedRunningTime="2026-04-17 07:54:52.40610487 +0000 UTC m=+160.062968294" Apr 17 07:54:53.273581 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:53.273536 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls\") pod \"image-registry-bc6b6b8b5-xzvk9\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") " pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:54:53.274057 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:53.273592 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-529rs\" (UID: \"7a34eada-e251-4bc7-8937-8f933c0cbd6f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs" Apr 17 07:54:53.274057 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:53.273633 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8" Apr 17 07:54:53.274057 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:53.273637 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:54:53.274057 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:53.273661 2576 projected.go:194] Error preparing data for projected volume 
registry-tls for pod openshift-image-registry/image-registry-bc6b6b8b5-xzvk9: secret "image-registry-tls" not found Apr 17 07:54:53.274057 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:53.273674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert\") pod \"ingress-canary-9w6bd\" (UID: \"8ff81f11-f2e2-4838-a775-e57edc28571c\") " pod="openshift-ingress-canary/ingress-canary-9w6bd" Apr 17 07:54:53.274057 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:53.273724 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 07:54:53.274057 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:53.273735 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls podName:72a398ea-84ce-463a-aa82-659a22cc916a nodeName:}" failed. No retries permitted until 2026-04-17 07:56:55.273715784 +0000 UTC m=+282.930579000 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls") pod "image-registry-bc6b6b8b5-xzvk9" (UID: "72a398ea-84ce-463a-aa82-659a22cc916a") : secret "image-registry-tls" not found Apr 17 07:54:53.274057 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:53.273742 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:54:53.274057 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:53.273779 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:54:53.274057 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:53.273783 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert podName:7a34eada-e251-4bc7-8937-8f933c0cbd6f nodeName:}" failed. No retries permitted until 2026-04-17 07:56:55.273765977 +0000 UTC m=+282.930629185 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-529rs" (UID: "7a34eada-e251-4bc7-8937-8f933c0cbd6f") : secret "networking-console-plugin-cert" not found Apr 17 07:54:53.274057 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:53.273807 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls podName:b1a369df-257c-47a4-96da-3025f897b1dd nodeName:}" failed. No retries permitted until 2026-04-17 07:56:55.273797647 +0000 UTC m=+282.930660839 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls") pod "dns-default-6f2m8" (UID: "b1a369df-257c-47a4-96da-3025f897b1dd") : secret "dns-default-metrics-tls" not found Apr 17 07:54:53.274057 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:53.273830 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert podName:8ff81f11-f2e2-4838-a775-e57edc28571c nodeName:}" failed. No retries permitted until 2026-04-17 07:56:55.27381509 +0000 UTC m=+282.930678285 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert") pod "ingress-canary-9w6bd" (UID: "8ff81f11-f2e2-4838-a775-e57edc28571c") : secret "canary-serving-cert" not found Apr 17 07:54:54.101242 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:54.101199 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" Apr 17 07:54:54.101242 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:54.101251 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" Apr 17 07:54:54.101738 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:54.101720 2576 scope.go:117] "RemoveContainer" containerID="8a6455c8d5065bd8e373516f6f6a04601ea1bc442ae2ab8eef18a5f2b3c2c496" Apr 17 07:54:54.102125 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:54.102098 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-zrcx8_openshift-console-operator(5f854e39-db34-4c83-8d33-c1d1898b7133)\"" pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" 
podUID="5f854e39-db34-4c83-8d33-c1d1898b7133" Apr 17 07:54:59.529722 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:59.529621 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xt24b\" (UID: \"b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b" Apr 17 07:54:59.530085 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:59.529768 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 07:54:59.530085 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:59.529846 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls podName:b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be nodeName:}" failed. No retries permitted until 2026-04-17 07:55:15.529828639 +0000 UTC m=+183.186691835 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-xt24b" (UID: "b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be") : secret "cluster-monitoring-operator-tls" not found Apr 17 07:54:59.731058 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:59.731008 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-metrics-certs\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl" Apr 17 07:54:59.731238 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:59.731072 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl" Apr 17 07:54:59.731307 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:54:59.731251 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle podName:999368dd-92dc-4226-a15b-73baf3d1e08a nodeName:}" failed. No retries permitted until 2026-04-17 07:55:15.731233043 +0000 UTC m=+183.388096239 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle") pod "router-default-7bcb9d645d-ktdgl" (UID: "999368dd-92dc-4226-a15b-73baf3d1e08a") : configmap references non-existent config key: service-ca.crt Apr 17 07:54:59.733361 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:54:59.733344 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/999368dd-92dc-4226-a15b-73baf3d1e08a-metrics-certs\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl" Apr 17 07:55:03.858572 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:03.858537 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wxvbl" Apr 17 07:55:05.858848 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:05.858816 2576 scope.go:117] "RemoveContainer" containerID="8a6455c8d5065bd8e373516f6f6a04601ea1bc442ae2ab8eef18a5f2b3c2c496" Apr 17 07:55:06.427361 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:06.427335 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log" Apr 17 07:55:06.427696 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:06.427683 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/1.log" Apr 17 07:55:06.427745 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:06.427713 2576 generic.go:358] "Generic (PLEG): container finished" podID="5f854e39-db34-4c83-8d33-c1d1898b7133" containerID="cb6ba27308a948b3bcd86180cd99c166c9a996a99fc0082a7d96ada95e37d21c" exitCode=255 Apr 17 07:55:06.427796 ip-10-0-137-165 kubenswrapper[2576]: I0417 
07:55:06.427780 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" event={"ID":"5f854e39-db34-4c83-8d33-c1d1898b7133","Type":"ContainerDied","Data":"cb6ba27308a948b3bcd86180cd99c166c9a996a99fc0082a7d96ada95e37d21c"} Apr 17 07:55:06.427846 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:06.427809 2576 scope.go:117] "RemoveContainer" containerID="8a6455c8d5065bd8e373516f6f6a04601ea1bc442ae2ab8eef18a5f2b3c2c496" Apr 17 07:55:06.428157 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:06.428131 2576 scope.go:117] "RemoveContainer" containerID="cb6ba27308a948b3bcd86180cd99c166c9a996a99fc0082a7d96ada95e37d21c" Apr 17 07:55:06.428379 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:55:06.428360 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-zrcx8_openshift-console-operator(5f854e39-db34-4c83-8d33-c1d1898b7133)\"" pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" podUID="5f854e39-db34-4c83-8d33-c1d1898b7133" Apr 17 07:55:07.431850 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:07.431821 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log" Apr 17 07:55:11.976128 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:11.976093 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-56dl7"] Apr 17 07:55:11.979487 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:11.979468 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-56dl7" Apr 17 07:55:11.983577 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:11.983552 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-54hk8\"" Apr 17 07:55:11.983691 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:11.983595 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 07:55:11.983691 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:11.983665 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 07:55:11.992241 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:11.992220 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-56dl7"] Apr 17 07:55:12.127712 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:12.127680 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/985c2da4-24ca-41d7-8de7-5cf5760588db-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-56dl7\" (UID: \"985c2da4-24ca-41d7-8de7-5cf5760588db\") " pod="openshift-insights/insights-runtime-extractor-56dl7" Apr 17 07:55:12.127863 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:12.127729 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/985c2da4-24ca-41d7-8de7-5cf5760588db-crio-socket\") pod \"insights-runtime-extractor-56dl7\" (UID: \"985c2da4-24ca-41d7-8de7-5cf5760588db\") " pod="openshift-insights/insights-runtime-extractor-56dl7" Apr 17 07:55:12.127863 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:12.127815 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/985c2da4-24ca-41d7-8de7-5cf5760588db-data-volume\") pod \"insights-runtime-extractor-56dl7\" (UID: \"985c2da4-24ca-41d7-8de7-5cf5760588db\") " pod="openshift-insights/insights-runtime-extractor-56dl7" Apr 17 07:55:12.127863 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:12.127858 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv7q6\" (UniqueName: \"kubernetes.io/projected/985c2da4-24ca-41d7-8de7-5cf5760588db-kube-api-access-gv7q6\") pod \"insights-runtime-extractor-56dl7\" (UID: \"985c2da4-24ca-41d7-8de7-5cf5760588db\") " pod="openshift-insights/insights-runtime-extractor-56dl7" Apr 17 07:55:12.127970 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:12.127878 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/985c2da4-24ca-41d7-8de7-5cf5760588db-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-56dl7\" (UID: \"985c2da4-24ca-41d7-8de7-5cf5760588db\") " pod="openshift-insights/insights-runtime-extractor-56dl7" Apr 17 07:55:12.228557 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:12.228468 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/985c2da4-24ca-41d7-8de7-5cf5760588db-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-56dl7\" (UID: \"985c2da4-24ca-41d7-8de7-5cf5760588db\") " pod="openshift-insights/insights-runtime-extractor-56dl7" Apr 17 07:55:12.228557 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:12.228522 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/985c2da4-24ca-41d7-8de7-5cf5760588db-crio-socket\") pod \"insights-runtime-extractor-56dl7\" (UID: \"985c2da4-24ca-41d7-8de7-5cf5760588db\") " 
pod="openshift-insights/insights-runtime-extractor-56dl7" Apr 17 07:55:12.228992 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:12.228587 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/985c2da4-24ca-41d7-8de7-5cf5760588db-crio-socket\") pod \"insights-runtime-extractor-56dl7\" (UID: \"985c2da4-24ca-41d7-8de7-5cf5760588db\") " pod="openshift-insights/insights-runtime-extractor-56dl7" Apr 17 07:55:12.228992 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:12.228664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/985c2da4-24ca-41d7-8de7-5cf5760588db-data-volume\") pod \"insights-runtime-extractor-56dl7\" (UID: \"985c2da4-24ca-41d7-8de7-5cf5760588db\") " pod="openshift-insights/insights-runtime-extractor-56dl7" Apr 17 07:55:12.228992 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:12.228707 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gv7q6\" (UniqueName: \"kubernetes.io/projected/985c2da4-24ca-41d7-8de7-5cf5760588db-kube-api-access-gv7q6\") pod \"insights-runtime-extractor-56dl7\" (UID: \"985c2da4-24ca-41d7-8de7-5cf5760588db\") " pod="openshift-insights/insights-runtime-extractor-56dl7" Apr 17 07:55:12.228992 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:12.228727 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/985c2da4-24ca-41d7-8de7-5cf5760588db-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-56dl7\" (UID: \"985c2da4-24ca-41d7-8de7-5cf5760588db\") " pod="openshift-insights/insights-runtime-extractor-56dl7" Apr 17 07:55:12.228992 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:12.228923 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/985c2da4-24ca-41d7-8de7-5cf5760588db-data-volume\") pod \"insights-runtime-extractor-56dl7\" (UID: \"985c2da4-24ca-41d7-8de7-5cf5760588db\") " pod="openshift-insights/insights-runtime-extractor-56dl7" Apr 17 07:55:12.229206 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:12.229101 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/985c2da4-24ca-41d7-8de7-5cf5760588db-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-56dl7\" (UID: \"985c2da4-24ca-41d7-8de7-5cf5760588db\") " pod="openshift-insights/insights-runtime-extractor-56dl7" Apr 17 07:55:12.230920 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:12.230904 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/985c2da4-24ca-41d7-8de7-5cf5760588db-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-56dl7\" (UID: \"985c2da4-24ca-41d7-8de7-5cf5760588db\") " pod="openshift-insights/insights-runtime-extractor-56dl7" Apr 17 07:55:12.239321 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:12.239299 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv7q6\" (UniqueName: \"kubernetes.io/projected/985c2da4-24ca-41d7-8de7-5cf5760588db-kube-api-access-gv7q6\") pod \"insights-runtime-extractor-56dl7\" (UID: \"985c2da4-24ca-41d7-8de7-5cf5760588db\") " pod="openshift-insights/insights-runtime-extractor-56dl7" Apr 17 07:55:12.288377 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:12.288347 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-56dl7" Apr 17 07:55:12.404606 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:12.404575 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-56dl7"] Apr 17 07:55:12.408141 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:55:12.408110 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod985c2da4_24ca_41d7_8de7_5cf5760588db.slice/crio-dba07d6bbaac2b33089a384da966f23ff741f838494a34ddcb11e70c949ed126 WatchSource:0}: Error finding container dba07d6bbaac2b33089a384da966f23ff741f838494a34ddcb11e70c949ed126: Status 404 returned error can't find the container with id dba07d6bbaac2b33089a384da966f23ff741f838494a34ddcb11e70c949ed126 Apr 17 07:55:12.444955 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:12.444922 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-56dl7" event={"ID":"985c2da4-24ca-41d7-8de7-5cf5760588db","Type":"ContainerStarted","Data":"dba07d6bbaac2b33089a384da966f23ff741f838494a34ddcb11e70c949ed126"} Apr 17 07:55:13.449078 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:13.449042 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-56dl7" event={"ID":"985c2da4-24ca-41d7-8de7-5cf5760588db","Type":"ContainerStarted","Data":"736b0a384eabe915664e6a715e28156c2b6da4a794e453204d528a4c067848c7"} Apr 17 07:55:13.449078 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:13.449083 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-56dl7" event={"ID":"985c2da4-24ca-41d7-8de7-5cf5760588db","Type":"ContainerStarted","Data":"8dd5cefc2f4e67523b8a6f151fc63ede0e4c21af39eb21413fc66f76a39ced0c"} Apr 17 07:55:14.101196 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:14.101157 2576 kubelet.go:2658] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" Apr 17 07:55:14.101196 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:14.101199 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" Apr 17 07:55:14.101592 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:14.101579 2576 scope.go:117] "RemoveContainer" containerID="cb6ba27308a948b3bcd86180cd99c166c9a996a99fc0082a7d96ada95e37d21c" Apr 17 07:55:14.101820 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:55:14.101789 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-zrcx8_openshift-console-operator(5f854e39-db34-4c83-8d33-c1d1898b7133)\"" pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" podUID="5f854e39-db34-4c83-8d33-c1d1898b7133" Apr 17 07:55:15.458493 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:15.458460 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-56dl7" event={"ID":"985c2da4-24ca-41d7-8de7-5cf5760588db","Type":"ContainerStarted","Data":"93ea0eee0e0459249dcdca127f04c5ad76084a5d69602acd886343db4b15824c"} Apr 17 07:55:15.475734 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:15.475678 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-56dl7" podStartSLOduration=2.470028186 podStartE2EDuration="4.475664864s" podCreationTimestamp="2026-04-17 07:55:11 +0000 UTC" firstStartedPulling="2026-04-17 07:55:12.462859019 +0000 UTC m=+180.119722209" lastFinishedPulling="2026-04-17 07:55:14.468495697 +0000 UTC m=+182.125358887" observedRunningTime="2026-04-17 07:55:15.474486608 +0000 UTC m=+183.131349821" watchObservedRunningTime="2026-04-17 
07:55:15.475664864 +0000 UTC m=+183.132528076" Apr 17 07:55:15.557579 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:15.557535 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xt24b\" (UID: \"b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b" Apr 17 07:55:15.560029 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:15.560004 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xt24b\" (UID: \"b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b" Apr 17 07:55:15.596448 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:15.596417 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-89tkw\"" Apr 17 07:55:15.604436 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:15.604413 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b" Apr 17 07:55:15.734770 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:15.734690 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b"] Apr 17 07:55:15.738235 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:55:15.738206 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb39b6955_dc2b_4aa2_9bc9_eb1ea55b33be.slice/crio-d4251473b12ce0c17a0d60c55bd21c41af292fb12e4f8712388466961451e10d WatchSource:0}: Error finding container d4251473b12ce0c17a0d60c55bd21c41af292fb12e4f8712388466961451e10d: Status 404 returned error can't find the container with id d4251473b12ce0c17a0d60c55bd21c41af292fb12e4f8712388466961451e10d Apr 17 07:55:15.759860 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:15.759829 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl" Apr 17 07:55:15.760479 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:15.760460 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/999368dd-92dc-4226-a15b-73baf3d1e08a-service-ca-bundle\") pod \"router-default-7bcb9d645d-ktdgl\" (UID: \"999368dd-92dc-4226-a15b-73baf3d1e08a\") " pod="openshift-ingress/router-default-7bcb9d645d-ktdgl" Apr 17 07:55:15.786079 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:15.786052 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-b9kgr\"" Apr 17 07:55:15.794449 ip-10-0-137-165 kubenswrapper[2576]: I0417 
07:55:15.794430 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7bcb9d645d-ktdgl" Apr 17 07:55:15.906772 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:15.906676 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7bcb9d645d-ktdgl"] Apr 17 07:55:15.909128 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:55:15.909097 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod999368dd_92dc_4226_a15b_73baf3d1e08a.slice/crio-3e58939d91e060142b9299133177b4e0cea09df6feec6fb08de1db820e450100 WatchSource:0}: Error finding container 3e58939d91e060142b9299133177b4e0cea09df6feec6fb08de1db820e450100: Status 404 returned error can't find the container with id 3e58939d91e060142b9299133177b4e0cea09df6feec6fb08de1db820e450100 Apr 17 07:55:16.466343 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:16.466300 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7bcb9d645d-ktdgl" event={"ID":"999368dd-92dc-4226-a15b-73baf3d1e08a","Type":"ContainerStarted","Data":"2f17fbaa40b1c8d9bb14ea547a849381e8e1f7684746b178abc13d31396b3ea2"} Apr 17 07:55:16.466343 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:16.466336 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7bcb9d645d-ktdgl" event={"ID":"999368dd-92dc-4226-a15b-73baf3d1e08a","Type":"ContainerStarted","Data":"3e58939d91e060142b9299133177b4e0cea09df6feec6fb08de1db820e450100"} Apr 17 07:55:16.467339 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:16.467315 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b" event={"ID":"b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be","Type":"ContainerStarted","Data":"d4251473b12ce0c17a0d60c55bd21c41af292fb12e4f8712388466961451e10d"} Apr 17 07:55:16.484510 ip-10-0-137-165 
kubenswrapper[2576]: I0417 07:55:16.484463 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7bcb9d645d-ktdgl" podStartSLOduration=33.484451201 podStartE2EDuration="33.484451201s" podCreationTimestamp="2026-04-17 07:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:55:16.483419986 +0000 UTC m=+184.140283199" watchObservedRunningTime="2026-04-17 07:55:16.484451201 +0000 UTC m=+184.141314413" Apr 17 07:55:16.795207 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:16.795117 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7bcb9d645d-ktdgl" Apr 17 07:55:16.798181 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:16.798154 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7bcb9d645d-ktdgl" Apr 17 07:55:17.470610 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:17.470575 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7bcb9d645d-ktdgl" Apr 17 07:55:17.471808 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:17.471786 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7bcb9d645d-ktdgl" Apr 17 07:55:17.973340 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:17.973307 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nzhl"] Apr 17 07:55:17.976411 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:17.976388 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nzhl" Apr 17 07:55:17.978839 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:17.978815 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 17 07:55:17.979165 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:17.979151 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-854pt\"" Apr 17 07:55:17.985835 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:17.985812 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nzhl"] Apr 17 07:55:18.082149 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:18.082113 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/52412a8f-a644-49c4-9da0-e872be529692-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-8nzhl\" (UID: \"52412a8f-a644-49c4-9da0-e872be529692\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nzhl" Apr 17 07:55:18.183440 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:18.183395 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/52412a8f-a644-49c4-9da0-e872be529692-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-8nzhl\" (UID: \"52412a8f-a644-49c4-9da0-e872be529692\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nzhl" Apr 17 07:55:18.185827 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:18.185795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/52412a8f-a644-49c4-9da0-e872be529692-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-8nzhl\" (UID: \"52412a8f-a644-49c4-9da0-e872be529692\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nzhl" Apr 17 07:55:18.286310 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:18.286193 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nzhl" Apr 17 07:55:18.404557 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:18.404519 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nzhl"] Apr 17 07:55:18.407598 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:55:18.407570 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52412a8f_a644_49c4_9da0_e872be529692.slice/crio-45bb48847b6118592f3e8825c0b9227f242608e9f165ab13da8909330d4d1dd4 WatchSource:0}: Error finding container 45bb48847b6118592f3e8825c0b9227f242608e9f165ab13da8909330d4d1dd4: Status 404 returned error can't find the container with id 45bb48847b6118592f3e8825c0b9227f242608e9f165ab13da8909330d4d1dd4 Apr 17 07:55:18.473617 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:18.473581 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b" event={"ID":"b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be","Type":"ContainerStarted","Data":"eca1429752d49b420d7d080a858ff9063677d0bfb005e287bfb5fa6aff60a6ae"} Apr 17 07:55:18.474556 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:18.474529 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nzhl" 
event={"ID":"52412a8f-a644-49c4-9da0-e872be529692","Type":"ContainerStarted","Data":"45bb48847b6118592f3e8825c0b9227f242608e9f165ab13da8909330d4d1dd4"} Apr 17 07:55:18.489828 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:18.489777 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xt24b" podStartSLOduration=33.755553053 podStartE2EDuration="35.489763729s" podCreationTimestamp="2026-04-17 07:54:43 +0000 UTC" firstStartedPulling="2026-04-17 07:55:15.739948001 +0000 UTC m=+183.396811193" lastFinishedPulling="2026-04-17 07:55:17.474158675 +0000 UTC m=+185.131021869" observedRunningTime="2026-04-17 07:55:18.488731287 +0000 UTC m=+186.145594502" watchObservedRunningTime="2026-04-17 07:55:18.489763729 +0000 UTC m=+186.146627011" Apr 17 07:55:20.483847 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:20.483815 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nzhl" event={"ID":"52412a8f-a644-49c4-9da0-e872be529692","Type":"ContainerStarted","Data":"b43f7c85f716cd5a5dbd7d689b8096e5e9511bfb9a6efdb3f38899079f7241e9"} Apr 17 07:55:20.484267 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:20.483990 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nzhl" Apr 17 07:55:20.488726 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:20.488701 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nzhl" Apr 17 07:55:20.500813 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:20.500771 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nzhl" podStartSLOduration=2.4490086460000002 podStartE2EDuration="3.500757893s" 
podCreationTimestamp="2026-04-17 07:55:17 +0000 UTC" firstStartedPulling="2026-04-17 07:55:18.409535747 +0000 UTC m=+186.066398937" lastFinishedPulling="2026-04-17 07:55:19.461284979 +0000 UTC m=+187.118148184" observedRunningTime="2026-04-17 07:55:20.499743497 +0000 UTC m=+188.156606710" watchObservedRunningTime="2026-04-17 07:55:20.500757893 +0000 UTC m=+188.157621119" Apr 17 07:55:21.028872 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:21.028834 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jmncg"] Apr 17 07:55:21.056899 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:21.056866 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jmncg"] Apr 17 07:55:21.057057 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:21.056973 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-jmncg" Apr 17 07:55:21.060538 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:21.060508 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 17 07:55:21.060538 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:21.060526 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 17 07:55:21.060743 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:21.060526 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 07:55:21.060743 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:21.060559 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-r82xj\"" Apr 17 07:55:21.206382 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:21.206342 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w97bn\" (UniqueName: \"kubernetes.io/projected/17ff8df6-8cf5-42bd-923b-68b156ef4cf5-kube-api-access-w97bn\") pod \"prometheus-operator-5676c8c784-jmncg\" (UID: \"17ff8df6-8cf5-42bd-923b-68b156ef4cf5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jmncg" Apr 17 07:55:21.206382 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:21.206387 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/17ff8df6-8cf5-42bd-923b-68b156ef4cf5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jmncg\" (UID: \"17ff8df6-8cf5-42bd-923b-68b156ef4cf5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jmncg" Apr 17 07:55:21.206649 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:21.206413 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/17ff8df6-8cf5-42bd-923b-68b156ef4cf5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jmncg\" (UID: \"17ff8df6-8cf5-42bd-923b-68b156ef4cf5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jmncg" Apr 17 07:55:21.206649 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:21.206431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17ff8df6-8cf5-42bd-923b-68b156ef4cf5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jmncg\" (UID: \"17ff8df6-8cf5-42bd-923b-68b156ef4cf5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jmncg" Apr 17 07:55:21.307548 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:21.307474 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/17ff8df6-8cf5-42bd-923b-68b156ef4cf5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jmncg\" (UID: \"17ff8df6-8cf5-42bd-923b-68b156ef4cf5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jmncg" Apr 17 07:55:21.307548 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:21.307516 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/17ff8df6-8cf5-42bd-923b-68b156ef4cf5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jmncg\" (UID: \"17ff8df6-8cf5-42bd-923b-68b156ef4cf5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jmncg" Apr 17 07:55:21.307548 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:21.307537 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17ff8df6-8cf5-42bd-923b-68b156ef4cf5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jmncg\" (UID: \"17ff8df6-8cf5-42bd-923b-68b156ef4cf5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jmncg" Apr 17 07:55:21.307905 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:21.307635 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w97bn\" (UniqueName: \"kubernetes.io/projected/17ff8df6-8cf5-42bd-923b-68b156ef4cf5-kube-api-access-w97bn\") pod \"prometheus-operator-5676c8c784-jmncg\" (UID: \"17ff8df6-8cf5-42bd-923b-68b156ef4cf5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jmncg" Apr 17 07:55:21.307905 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:55:21.307649 2576 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 17 07:55:21.307905 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:55:21.307725 2576 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/17ff8df6-8cf5-42bd-923b-68b156ef4cf5-prometheus-operator-tls podName:17ff8df6-8cf5-42bd-923b-68b156ef4cf5 nodeName:}" failed. No retries permitted until 2026-04-17 07:55:21.807700765 +0000 UTC m=+189.464563975 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/17ff8df6-8cf5-42bd-923b-68b156ef4cf5-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-jmncg" (UID: "17ff8df6-8cf5-42bd-923b-68b156ef4cf5") : secret "prometheus-operator-tls" not found Apr 17 07:55:21.308319 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:21.308299 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17ff8df6-8cf5-42bd-923b-68b156ef4cf5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jmncg\" (UID: \"17ff8df6-8cf5-42bd-923b-68b156ef4cf5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jmncg" Apr 17 07:55:21.309978 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:21.309955 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/17ff8df6-8cf5-42bd-923b-68b156ef4cf5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jmncg\" (UID: \"17ff8df6-8cf5-42bd-923b-68b156ef4cf5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jmncg" Apr 17 07:55:21.316353 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:21.316327 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w97bn\" (UniqueName: \"kubernetes.io/projected/17ff8df6-8cf5-42bd-923b-68b156ef4cf5-kube-api-access-w97bn\") pod \"prometheus-operator-5676c8c784-jmncg\" (UID: \"17ff8df6-8cf5-42bd-923b-68b156ef4cf5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jmncg" Apr 17 07:55:21.811398 ip-10-0-137-165 
kubenswrapper[2576]: I0417 07:55:21.811363 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/17ff8df6-8cf5-42bd-923b-68b156ef4cf5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jmncg\" (UID: \"17ff8df6-8cf5-42bd-923b-68b156ef4cf5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jmncg"
Apr 17 07:55:21.813791 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:21.813768 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/17ff8df6-8cf5-42bd-923b-68b156ef4cf5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jmncg\" (UID: \"17ff8df6-8cf5-42bd-923b-68b156ef4cf5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jmncg"
Apr 17 07:55:21.966290 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:21.966249 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-jmncg"
Apr 17 07:55:22.083871 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:22.083759 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jmncg"]
Apr 17 07:55:22.098294 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:55:22.098252 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17ff8df6_8cf5_42bd_923b_68b156ef4cf5.slice/crio-dd535e018eff8fceb0ed574e3225353c6d799444c15d68614a01d63afa21ac3f WatchSource:0}: Error finding container dd535e018eff8fceb0ed574e3225353c6d799444c15d68614a01d63afa21ac3f: Status 404 returned error can't find the container with id dd535e018eff8fceb0ed574e3225353c6d799444c15d68614a01d63afa21ac3f
Apr 17 07:55:22.490540 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:22.490507 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jmncg" event={"ID":"17ff8df6-8cf5-42bd-923b-68b156ef4cf5","Type":"ContainerStarted","Data":"dd535e018eff8fceb0ed574e3225353c6d799444c15d68614a01d63afa21ac3f"}
Apr 17 07:55:23.494788 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:23.494693 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jmncg" event={"ID":"17ff8df6-8cf5-42bd-923b-68b156ef4cf5","Type":"ContainerStarted","Data":"a68f8802f035d705cd907355ce1f8ad627e43b9c517facb384d6a7dbe9da6e5f"}
Apr 17 07:55:23.494788 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:23.494728 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jmncg" event={"ID":"17ff8df6-8cf5-42bd-923b-68b156ef4cf5","Type":"ContainerStarted","Data":"76eff1c860e01d0a4f2e2c3ae1e540418e4918898192b142905f71ef1594186c"}
Apr 17 07:55:23.510454 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:23.510403 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-jmncg" podStartSLOduration=1.387796107 podStartE2EDuration="2.510374602s" podCreationTimestamp="2026-04-17 07:55:21 +0000 UTC" firstStartedPulling="2026-04-17 07:55:22.100026937 +0000 UTC m=+189.756890128" lastFinishedPulling="2026-04-17 07:55:23.222605417 +0000 UTC m=+190.879468623" observedRunningTime="2026-04-17 07:55:23.509805475 +0000 UTC m=+191.166668689" watchObservedRunningTime="2026-04-17 07:55:23.510374602 +0000 UTC m=+191.167237814"
Apr 17 07:55:25.363229 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.363190 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-59scv"]
Apr 17 07:55:25.367036 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.367013 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-59scv"
Apr 17 07:55:25.369600 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.369577 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 17 07:55:25.369804 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.369775 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 17 07:55:25.369961 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.369660 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-72gkb\""
Apr 17 07:55:25.373386 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.373362 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-h7xpg"]
Apr 17 07:55:25.377042 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.377020 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.380150 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.380128 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-bpbfn\""
Apr 17 07:55:25.380763 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.380746 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 07:55:25.380917 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.380896 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 07:55:25.381296 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.381256 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 07:55:25.383497 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.383477 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-59scv"]
Apr 17 07:55:25.442989 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.442958 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f077267b-3589-45a9-a1ef-ad4d07a595cf-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-59scv\" (UID: \"f077267b-3589-45a9-a1ef-ad4d07a595cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-59scv"
Apr 17 07:55:25.443188 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.443056 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f077267b-3589-45a9-a1ef-ad4d07a595cf-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-59scv\" (UID: \"f077267b-3589-45a9-a1ef-ad4d07a595cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-59scv"
Apr 17 07:55:25.443188 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.443093 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj6ld\" (UniqueName: \"kubernetes.io/projected/f077267b-3589-45a9-a1ef-ad4d07a595cf-kube-api-access-kj6ld\") pod \"openshift-state-metrics-9d44df66c-59scv\" (UID: \"f077267b-3589-45a9-a1ef-ad4d07a595cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-59scv"
Apr 17 07:55:25.443188 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.443150 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f077267b-3589-45a9-a1ef-ad4d07a595cf-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-59scv\" (UID: \"f077267b-3589-45a9-a1ef-ad4d07a595cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-59scv"
Apr 17 07:55:25.544181 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.544140 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-node-exporter-accelerators-collector-config\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.544181 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.544182 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p72gv\" (UniqueName: \"kubernetes.io/projected/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-kube-api-access-p72gv\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.544466 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.544203 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-node-exporter-tls\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.544466 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.544294 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-node-exporter-textfile\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.544466 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.544352 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f077267b-3589-45a9-a1ef-ad4d07a595cf-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-59scv\" (UID: \"f077267b-3589-45a9-a1ef-ad4d07a595cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-59scv"
Apr 17 07:55:25.544466 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.544416 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-root\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.544466 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.544466 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-node-exporter-wtmp\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.544709 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.544491 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.544709 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.544549 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f077267b-3589-45a9-a1ef-ad4d07a595cf-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-59scv\" (UID: \"f077267b-3589-45a9-a1ef-ad4d07a595cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-59scv"
Apr 17 07:55:25.544709 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:55:25.544673 2576 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 17 07:55:25.544709 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.544692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-metrics-client-ca\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.544909 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.544725 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f077267b-3589-45a9-a1ef-ad4d07a595cf-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-59scv\" (UID: \"f077267b-3589-45a9-a1ef-ad4d07a595cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-59scv"
Apr 17 07:55:25.544909 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:55:25.544744 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f077267b-3589-45a9-a1ef-ad4d07a595cf-openshift-state-metrics-tls podName:f077267b-3589-45a9-a1ef-ad4d07a595cf nodeName:}" failed. No retries permitted until 2026-04-17 07:55:26.044724169 +0000 UTC m=+193.701587366 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f077267b-3589-45a9-a1ef-ad4d07a595cf-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-59scv" (UID: "f077267b-3589-45a9-a1ef-ad4d07a595cf") : secret "openshift-state-metrics-tls" not found
Apr 17 07:55:25.544909 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.544778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kj6ld\" (UniqueName: \"kubernetes.io/projected/f077267b-3589-45a9-a1ef-ad4d07a595cf-kube-api-access-kj6ld\") pod \"openshift-state-metrics-9d44df66c-59scv\" (UID: \"f077267b-3589-45a9-a1ef-ad4d07a595cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-59scv"
Apr 17 07:55:25.544909 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.544827 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-sys\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.545159 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.545139 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f077267b-3589-45a9-a1ef-ad4d07a595cf-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-59scv\" (UID: \"f077267b-3589-45a9-a1ef-ad4d07a595cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-59scv"
Apr 17 07:55:25.547042 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.547022 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f077267b-3589-45a9-a1ef-ad4d07a595cf-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-59scv\" (UID: \"f077267b-3589-45a9-a1ef-ad4d07a595cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-59scv"
Apr 17 07:55:25.555925 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.555895 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj6ld\" (UniqueName: \"kubernetes.io/projected/f077267b-3589-45a9-a1ef-ad4d07a595cf-kube-api-access-kj6ld\") pod \"openshift-state-metrics-9d44df66c-59scv\" (UID: \"f077267b-3589-45a9-a1ef-ad4d07a595cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-59scv"
Apr 17 07:55:25.646090 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.645995 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-root\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.646090 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.646054 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-node-exporter-wtmp\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.646090 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.646076 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.646397 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.646111 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-root\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.646397 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.646142 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-metrics-client-ca\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.646397 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.646181 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-sys\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.646397 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.646204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-node-exporter-accelerators-collector-config\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.646397 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.646229 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p72gv\" (UniqueName: \"kubernetes.io/projected/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-kube-api-access-p72gv\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.646397 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.646241 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-node-exporter-wtmp\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.646397 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.646259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-node-exporter-tls\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.646397 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.646312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-node-exporter-textfile\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.646397 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.646302 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-sys\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.646758 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.646737 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-node-exporter-textfile\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.646886 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.646870 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-node-exporter-accelerators-collector-config\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.646970 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.646949 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-metrics-client-ca\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.648529 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.648513 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.648615 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.648595 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-node-exporter-tls\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.654073 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.654051 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p72gv\" (UniqueName: \"kubernetes.io/projected/5c6eb6cc-f216-4722-8e3d-297a548b8b5d-kube-api-access-p72gv\") pod \"node-exporter-h7xpg\" (UID: \"5c6eb6cc-f216-4722-8e3d-297a548b8b5d\") " pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.691245 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:25.691209 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-h7xpg"
Apr 17 07:55:25.700683 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:55:25.700646 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c6eb6cc_f216_4722_8e3d_297a548b8b5d.slice/crio-8ef618fdea286ea9135357be719aa3e4a569997ca7224385d363fadc1f645b78 WatchSource:0}: Error finding container 8ef618fdea286ea9135357be719aa3e4a569997ca7224385d363fadc1f645b78: Status 404 returned error can't find the container with id 8ef618fdea286ea9135357be719aa3e4a569997ca7224385d363fadc1f645b78
Apr 17 07:55:26.051433 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:26.051343 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f077267b-3589-45a9-a1ef-ad4d07a595cf-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-59scv\" (UID: \"f077267b-3589-45a9-a1ef-ad4d07a595cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-59scv"
Apr 17 07:55:26.053678 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:26.053650 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f077267b-3589-45a9-a1ef-ad4d07a595cf-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-59scv\" (UID: \"f077267b-3589-45a9-a1ef-ad4d07a595cf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-59scv"
Apr 17 07:55:26.281129 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:26.281071 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-59scv"
Apr 17 07:55:26.434553 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:26.434515 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-59scv"]
Apr 17 07:55:26.504291 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:26.504242 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h7xpg" event={"ID":"5c6eb6cc-f216-4722-8e3d-297a548b8b5d","Type":"ContainerStarted","Data":"8ef618fdea286ea9135357be719aa3e4a569997ca7224385d363fadc1f645b78"}
Apr 17 07:55:26.520668 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:55:26.520635 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf077267b_3589_45a9_a1ef_ad4d07a595cf.slice/crio-dd6a54d38907cbb9273449661418263373b69570d5eb3d417ddc00e1684c0a95 WatchSource:0}: Error finding container dd6a54d38907cbb9273449661418263373b69570d5eb3d417ddc00e1684c0a95: Status 404 returned error can't find the container with id dd6a54d38907cbb9273449661418263373b69570d5eb3d417ddc00e1684c0a95
Apr 17 07:55:27.508775 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:27.508734 2576 generic.go:358] "Generic (PLEG): container finished" podID="5c6eb6cc-f216-4722-8e3d-297a548b8b5d" containerID="6c02133cf88381de65ab18d442eaef85ba05d5fd84ecdf84f63b35fdc8736be3" exitCode=0
Apr 17 07:55:27.509206 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:27.508811 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h7xpg" event={"ID":"5c6eb6cc-f216-4722-8e3d-297a548b8b5d","Type":"ContainerDied","Data":"6c02133cf88381de65ab18d442eaef85ba05d5fd84ecdf84f63b35fdc8736be3"}
Apr 17 07:55:27.510449 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:27.510431 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-59scv" event={"ID":"f077267b-3589-45a9-a1ef-ad4d07a595cf","Type":"ContainerStarted","Data":"fb151540731c9bd6a0f83ca912626ffbfc23e4acf9aa98027e447742855cc92e"}
Apr 17 07:55:27.510509 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:27.510455 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-59scv" event={"ID":"f077267b-3589-45a9-a1ef-ad4d07a595cf","Type":"ContainerStarted","Data":"d16cf8a69d2c18cf58933bf2ea3f1f59403c03d2f929b4b162fe784ecda365ac"}
Apr 17 07:55:27.510509 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:27.510466 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-59scv" event={"ID":"f077267b-3589-45a9-a1ef-ad4d07a595cf","Type":"ContainerStarted","Data":"dd6a54d38907cbb9273449661418263373b69570d5eb3d417ddc00e1684c0a95"}
Apr 17 07:55:28.515180 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:28.515141 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-59scv" event={"ID":"f077267b-3589-45a9-a1ef-ad4d07a595cf","Type":"ContainerStarted","Data":"236b0fc312b3945285c95125ba0710ac8c7da53cf00d16a7b8aaa5762e7c0501"}
Apr 17 07:55:28.517181 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:28.517157 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h7xpg" event={"ID":"5c6eb6cc-f216-4722-8e3d-297a548b8b5d","Type":"ContainerStarted","Data":"e6da6dd2ec901c72884fa983582ac9c790f08f63fbcc5f44bb6844a1f2d8c034"}
Apr 17 07:55:28.517304 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:28.517186 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h7xpg" event={"ID":"5c6eb6cc-f216-4722-8e3d-297a548b8b5d","Type":"ContainerStarted","Data":"0fb1589cf77ed1b87e5f409848849f032b32a8bdaaae49fa98ce99bc742ba971"}
Apr 17 07:55:28.534903 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:28.534842 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-59scv" podStartSLOduration=2.6498257069999998 podStartE2EDuration="3.534823572s" podCreationTimestamp="2026-04-17 07:55:25 +0000 UTC" firstStartedPulling="2026-04-17 07:55:26.654736421 +0000 UTC m=+194.311599616" lastFinishedPulling="2026-04-17 07:55:27.539734287 +0000 UTC m=+195.196597481" observedRunningTime="2026-04-17 07:55:28.533572918 +0000 UTC m=+196.190436133" watchObservedRunningTime="2026-04-17 07:55:28.534823572 +0000 UTC m=+196.191686787"
Apr 17 07:55:28.553906 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:28.553858 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-h7xpg" podStartSLOduration=2.689864669 podStartE2EDuration="3.553843187s" podCreationTimestamp="2026-04-17 07:55:25 +0000 UTC" firstStartedPulling="2026-04-17 07:55:25.702893456 +0000 UTC m=+193.359756652" lastFinishedPulling="2026-04-17 07:55:26.566871962 +0000 UTC m=+194.223735170" observedRunningTime="2026-04-17 07:55:28.553156022 +0000 UTC m=+196.210019235" watchObservedRunningTime="2026-04-17 07:55:28.553843187 +0000 UTC m=+196.210706401"
Apr 17 07:55:28.858433 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:28.858400 2576 scope.go:117] "RemoveContainer" containerID="cb6ba27308a948b3bcd86180cd99c166c9a996a99fc0082a7d96ada95e37d21c"
Apr 17 07:55:29.522319 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:29.522266 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log"
Apr 17 07:55:29.522793 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:29.522529 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" event={"ID":"5f854e39-db34-4c83-8d33-c1d1898b7133","Type":"ContainerStarted","Data":"e0a12ab2834c648e47bf5c82216d92155593b34ff0a3b73cfacedafa117e9b9e"}
Apr 17 07:55:29.523054 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:29.523028 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8"
Apr 17 07:55:29.527858 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:29.527830 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8"
Apr 17 07:55:29.540919 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:29.540853 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-zrcx8" podStartSLOduration=43.109908757 podStartE2EDuration="46.540835469s" podCreationTimestamp="2026-04-17 07:54:43 +0000 UTC" firstStartedPulling="2026-04-17 07:54:44.248138693 +0000 UTC m=+151.905001884" lastFinishedPulling="2026-04-17 07:54:47.679065404 +0000 UTC m=+155.335928596" observedRunningTime="2026-04-17 07:55:29.540825848 +0000 UTC m=+197.197689061" watchObservedRunningTime="2026-04-17 07:55:29.540835469 +0000 UTC m=+197.197698683"
Apr 17 07:55:29.689367 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:29.689338 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-6594d"]
Apr 17 07:55:29.692505 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:29.692486 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-6594d"
Apr 17 07:55:29.694795 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:29.694772 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 07:55:29.694908 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:29.694771 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-xt5df\""
Apr 17 07:55:29.694908 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:29.694778 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 17 07:55:29.702565 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:29.702545 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-6594d"]
Apr 17 07:55:29.787412 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:29.787314 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwxxw\" (UniqueName: \"kubernetes.io/projected/3779ef4e-6fc6-4738-bab0-223e27cbd53b-kube-api-access-jwxxw\") pod \"downloads-6bcc868b7-6594d\" (UID: \"3779ef4e-6fc6-4738-bab0-223e27cbd53b\") " pod="openshift-console/downloads-6bcc868b7-6594d"
Apr 17 07:55:29.888698 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:29.888660 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwxxw\" (UniqueName: \"kubernetes.io/projected/3779ef4e-6fc6-4738-bab0-223e27cbd53b-kube-api-access-jwxxw\") pod \"downloads-6bcc868b7-6594d\" (UID: \"3779ef4e-6fc6-4738-bab0-223e27cbd53b\") " pod="openshift-console/downloads-6bcc868b7-6594d"
Apr 17 07:55:29.897478 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:29.897454 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwxxw\" (UniqueName: \"kubernetes.io/projected/3779ef4e-6fc6-4738-bab0-223e27cbd53b-kube-api-access-jwxxw\") pod \"downloads-6bcc868b7-6594d\" (UID: \"3779ef4e-6fc6-4738-bab0-223e27cbd53b\") " pod="openshift-console/downloads-6bcc868b7-6594d"
Apr 17 07:55:30.001590 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:30.001557 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-6594d"
Apr 17 07:55:30.128819 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:30.128784 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-6594d"]
Apr 17 07:55:30.133809 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:55:30.133769 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3779ef4e_6fc6_4738_bab0_223e27cbd53b.slice/crio-664e68addb4595e3a8f216b0b6123ad8e168b5939a77515541a14bd4bc3aa140 WatchSource:0}: Error finding container 664e68addb4595e3a8f216b0b6123ad8e168b5939a77515541a14bd4bc3aa140: Status 404 returned error can't find the container with id 664e68addb4595e3a8f216b0b6123ad8e168b5939a77515541a14bd4bc3aa140
Apr 17 07:55:30.525914 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:30.525876 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-6594d" event={"ID":"3779ef4e-6fc6-4738-bab0-223e27cbd53b","Type":"ContainerStarted","Data":"664e68addb4595e3a8f216b0b6123ad8e168b5939a77515541a14bd4bc3aa140"}
Apr 17 07:55:31.649359 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.649323 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 07:55:31.655655 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.655610 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:55:31.658404 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.658373 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 17 07:55:31.662049 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.659512 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 17 07:55:31.662049 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.659884 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 17 07:55:31.662049 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.660732 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 07:55:31.662049 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.660940 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 17 07:55:31.662049 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.661133 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 17 07:55:31.662049 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.661681 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 17 07:55:31.662049 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.661857 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 17 07:55:31.662049 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.661901 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 17 07:55:31.662049 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.661911 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 17 07:55:31.662049 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.662039 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8u5ojncutude3\""
Apr 17 07:55:31.662049 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.662053 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-85w9j\""
Apr 17 07:55:31.662676 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.662078 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 17 07:55:31.662676 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.662116 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 17 07:55:31.663860 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.663836 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 17 07:55:31.668815 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.668767 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 07:55:31.809401 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.809349 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.809596 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.809479 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nkz9\" (UniqueName: \"kubernetes.io/projected/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-kube-api-access-6nkz9\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.809596 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.809533 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.809596 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.809571 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.809773 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.809640 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.809773 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.809660 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.809773 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.809677 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.809773 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.809742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-config\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.809773 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.809770 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-web-config\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.810020 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.809797 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.810020 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.809935 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.810020 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.809973 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.810020 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.810007 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.810217 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.810084 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.810217 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.810124 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.810217 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.810152 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-config-out\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.810401 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.810221 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.810401 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.810248 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.911054 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.910956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.911054 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.911015 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-config-out\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.911299 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.911066 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.912374 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.911750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.912374 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.911799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.912374 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.911854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nkz9\" (UniqueName: \"kubernetes.io/projected/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-kube-api-access-6nkz9\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.912374 ip-10-0-137-165 
kubenswrapper[2576]: I0417 07:55:31.911897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.912374 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.911937 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.912374 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.912004 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.912374 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.912034 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.912374 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.912062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.912374 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.912121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-config\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.912374 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.912143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-web-config\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.913157 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.913031 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.913791 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.913766 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.913882 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.913788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.913882 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.913859 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.913988 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.913895 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.913988 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.913923 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.913988 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.913955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.914146 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.914099 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.914783 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.914754 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.917561 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.917228 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.917561 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.917307 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.917561 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.917360 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.917561 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.917457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.917923 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.917895 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-config-out\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.918636 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.918610 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.921143 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.920704 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-web-config\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.921143 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.921100 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.921143 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.921102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" 
(UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.921992 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.921804 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-config\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.923305 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.922679 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.923305 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.923200 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.923305 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.923247 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.924737 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.923810 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nkz9\" (UniqueName: \"kubernetes.io/projected/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-kube-api-access-6nkz9\") pod \"prometheus-k8s-0\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:31.970822 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:31.970711 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:32.136757 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:32.136702 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 07:55:32.140600 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:55:32.140572 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4c57183_ce96_44f3_9b36_e53c4d00f0e6.slice/crio-01f2bcc7738ae2cc4810bc886c65fce65a13bc20730ee4e969068caf84a10b3c WatchSource:0}: Error finding container 01f2bcc7738ae2cc4810bc886c65fce65a13bc20730ee4e969068caf84a10b3c: Status 404 returned error can't find the container with id 01f2bcc7738ae2cc4810bc886c65fce65a13bc20730ee4e969068caf84a10b3c Apr 17 07:55:32.535324 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:32.535202 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a4c57183-ce96-44f3-9b36-e53c4d00f0e6","Type":"ContainerStarted","Data":"01f2bcc7738ae2cc4810bc886c65fce65a13bc20730ee4e969068caf84a10b3c"} Apr 17 07:55:33.539435 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:33.539338 2576 generic.go:358] "Generic (PLEG): container finished" podID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerID="74ecf940db2b179c535381e7940cbccda4b249a9e7c96e67547392a90b0bb2f3" exitCode=0 Apr 17 07:55:33.539847 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:33.539429 2576 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a4c57183-ce96-44f3-9b36-e53c4d00f0e6","Type":"ContainerDied","Data":"74ecf940db2b179c535381e7940cbccda4b249a9e7c96e67547392a90b0bb2f3"} Apr 17 07:55:33.697228 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:33.697184 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-bc6b6b8b5-xzvk9"] Apr 17 07:55:33.697529 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:55:33.697499 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" podUID="72a398ea-84ce-463a-aa82-659a22cc916a" Apr 17 07:55:34.542448 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.542419 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9" Apr 17 07:55:34.548573 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.548475 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9"
Apr 17 07:55:34.741752 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.741402 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-bound-sa-token\") pod \"72a398ea-84ce-463a-aa82-659a22cc916a\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") "
Apr 17 07:55:34.741752 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.741447 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/72a398ea-84ce-463a-aa82-659a22cc916a-ca-trust-extracted\") pod \"72a398ea-84ce-463a-aa82-659a22cc916a\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") "
Apr 17 07:55:34.741752 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.741510 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72a398ea-84ce-463a-aa82-659a22cc916a-trusted-ca\") pod \"72a398ea-84ce-463a-aa82-659a22cc916a\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") "
Apr 17 07:55:34.741752 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.741542 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/72a398ea-84ce-463a-aa82-659a22cc916a-image-registry-private-configuration\") pod \"72a398ea-84ce-463a-aa82-659a22cc916a\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") "
Apr 17 07:55:34.741752 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.741611 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/72a398ea-84ce-463a-aa82-659a22cc916a-registry-certificates\") pod \"72a398ea-84ce-463a-aa82-659a22cc916a\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") "
Apr 17 07:55:34.741752 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.741643 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49rsw\" (UniqueName: \"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-kube-api-access-49rsw\") pod \"72a398ea-84ce-463a-aa82-659a22cc916a\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") "
Apr 17 07:55:34.741752 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.741677 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/72a398ea-84ce-463a-aa82-659a22cc916a-installation-pull-secrets\") pod \"72a398ea-84ce-463a-aa82-659a22cc916a\" (UID: \"72a398ea-84ce-463a-aa82-659a22cc916a\") "
Apr 17 07:55:34.742233 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.742044 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72a398ea-84ce-463a-aa82-659a22cc916a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "72a398ea-84ce-463a-aa82-659a22cc916a" (UID: "72a398ea-84ce-463a-aa82-659a22cc916a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 07:55:34.742336 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.742292 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72a398ea-84ce-463a-aa82-659a22cc916a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "72a398ea-84ce-463a-aa82-659a22cc916a" (UID: "72a398ea-84ce-463a-aa82-659a22cc916a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:55:34.742336 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.742334 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72a398ea-84ce-463a-aa82-659a22cc916a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "72a398ea-84ce-463a-aa82-659a22cc916a" (UID: "72a398ea-84ce-463a-aa82-659a22cc916a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:55:34.742962 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.742799 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/72a398ea-84ce-463a-aa82-659a22cc916a-registry-certificates\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\""
Apr 17 07:55:34.742962 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.742823 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/72a398ea-84ce-463a-aa82-659a22cc916a-ca-trust-extracted\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\""
Apr 17 07:55:34.742962 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.742838 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72a398ea-84ce-463a-aa82-659a22cc916a-trusted-ca\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\""
Apr 17 07:55:34.744622 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.744569 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a398ea-84ce-463a-aa82-659a22cc916a-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "72a398ea-84ce-463a-aa82-659a22cc916a" (UID: "72a398ea-84ce-463a-aa82-659a22cc916a"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 07:55:34.745191 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.745165 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "72a398ea-84ce-463a-aa82-659a22cc916a" (UID: "72a398ea-84ce-463a-aa82-659a22cc916a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 07:55:34.745363 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.745340 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-kube-api-access-49rsw" (OuterVolumeSpecName: "kube-api-access-49rsw") pod "72a398ea-84ce-463a-aa82-659a22cc916a" (UID: "72a398ea-84ce-463a-aa82-659a22cc916a"). InnerVolumeSpecName "kube-api-access-49rsw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 07:55:34.745442 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.745417 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a398ea-84ce-463a-aa82-659a22cc916a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "72a398ea-84ce-463a-aa82-659a22cc916a" (UID: "72a398ea-84ce-463a-aa82-659a22cc916a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 07:55:34.844395 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.844356 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/72a398ea-84ce-463a-aa82-659a22cc916a-image-registry-private-configuration\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\""
Apr 17 07:55:34.844395 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.844393 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-49rsw\" (UniqueName: \"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-kube-api-access-49rsw\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\""
Apr 17 07:55:34.844646 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.844409 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/72a398ea-84ce-463a-aa82-659a22cc916a-installation-pull-secrets\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\""
Apr 17 07:55:34.844646 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:34.844425 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-bound-sa-token\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\""
Apr 17 07:55:35.550167 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:35.547318 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-bc6b6b8b5-xzvk9"
Apr 17 07:55:35.584472 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:35.584423 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-bc6b6b8b5-xzvk9"]
Apr 17 07:55:35.588765 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:35.588733 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-bc6b6b8b5-xzvk9"]
Apr 17 07:55:35.652706 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:35.652662 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72a398ea-84ce-463a-aa82-659a22cc916a-registry-tls\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\""
Apr 17 07:55:36.863740 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:36.863618 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72a398ea-84ce-463a-aa82-659a22cc916a" path="/var/lib/kubelet/pods/72a398ea-84ce-463a-aa82-659a22cc916a/volumes"
Apr 17 07:55:37.558400 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:37.558313 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a4c57183-ce96-44f3-9b36-e53c4d00f0e6","Type":"ContainerStarted","Data":"1462b1d807898936362946501cbdf764ebb49a7a530dd9d298c01a472576bf03"}
Apr 17 07:55:37.558400 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:37.558361 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a4c57183-ce96-44f3-9b36-e53c4d00f0e6","Type":"ContainerStarted","Data":"2c3fb30881f22bdbd57792fc58d7dc6a01e988fc43e0202c47513c23d5002f9e"}
Apr 17 07:55:39.568534 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:39.568496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a4c57183-ce96-44f3-9b36-e53c4d00f0e6","Type":"ContainerStarted","Data":"32db80d63963ec3008ce06e40ea11c33f673d9dbe4eef90592d88a2d2964f8b5"}
Apr 17 07:55:39.568534 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:39.568534 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a4c57183-ce96-44f3-9b36-e53c4d00f0e6","Type":"ContainerStarted","Data":"119d2a2a6be93908c7f0ea600030d152c321c7a32fa567170340d119886a5dac"}
Apr 17 07:55:39.568963 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:39.568545 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a4c57183-ce96-44f3-9b36-e53c4d00f0e6","Type":"ContainerStarted","Data":"fdbfa8a473a193609399e178bda1c9d418c2961e0fd720d820e08def53189d87"}
Apr 17 07:55:39.568963 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:39.568555 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a4c57183-ce96-44f3-9b36-e53c4d00f0e6","Type":"ContainerStarted","Data":"544830f11303ccb60d57e7c0a7ee4a377927aaea8efb069a538c91650195f7b3"}
Apr 17 07:55:39.595903 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:39.595711 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.141111255 podStartE2EDuration="8.595691812s" podCreationTimestamp="2026-04-17 07:55:31 +0000 UTC" firstStartedPulling="2026-04-17 07:55:32.14292384 +0000 UTC m=+199.799787030" lastFinishedPulling="2026-04-17 07:55:38.59750438 +0000 UTC m=+206.254367587" observedRunningTime="2026-04-17 07:55:39.594183955 +0000 UTC m=+207.251047168" watchObservedRunningTime="2026-04-17 07:55:39.595691812 +0000 UTC m=+207.252555032"
Apr 17 07:55:41.971243 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:41.971197 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:55:47.602946 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:47.602905 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-6594d" event={"ID":"3779ef4e-6fc6-4738-bab0-223e27cbd53b","Type":"ContainerStarted","Data":"b726b4150dce2fe2b728abd8ba4f47ffbc474e8053e914843b214b65a9705e8f"}
Apr 17 07:55:47.603459 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:47.603267 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-6594d"
Apr 17 07:55:47.614460 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:47.614430 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-6594d"
Apr 17 07:55:47.620365 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:55:47.620307 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-6594d" podStartSLOduration=1.892006106 podStartE2EDuration="18.620266612s" podCreationTimestamp="2026-04-17 07:55:29 +0000 UTC" firstStartedPulling="2026-04-17 07:55:30.136492573 +0000 UTC m=+197.793355764" lastFinishedPulling="2026-04-17 07:55:46.864753074 +0000 UTC m=+214.521616270" observedRunningTime="2026-04-17 07:55:47.618519156 +0000 UTC m=+215.275382393" watchObservedRunningTime="2026-04-17 07:55:47.620266612 +0000 UTC m=+215.277129825"
Apr 17 07:56:03.648955 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:03.648868 2576 generic.go:358] "Generic (PLEG): container finished" podID="83abce3c-9745-4587-a4d0-fc4d481c1c19" containerID="765aac79c956db99e91f7f160f22507b753496c935785a0c3e0712b77042fd4c" exitCode=0
Apr 17 07:56:03.648955 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:03.648942 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4dn48" event={"ID":"83abce3c-9745-4587-a4d0-fc4d481c1c19","Type":"ContainerDied","Data":"765aac79c956db99e91f7f160f22507b753496c935785a0c3e0712b77042fd4c"}
Apr 17 07:56:03.649431 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:03.649252 2576 scope.go:117] "RemoveContainer" containerID="765aac79c956db99e91f7f160f22507b753496c935785a0c3e0712b77042fd4c"
Apr 17 07:56:04.652718 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:04.652684 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4dn48" event={"ID":"83abce3c-9745-4587-a4d0-fc4d481c1c19","Type":"ContainerStarted","Data":"78e8fb5710f6dc852a024cf7c0a7dd199f415d08ad8bbaf235ec8bfa322b3389"}
Apr 17 07:56:13.676638 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:13.676603 2576 generic.go:358] "Generic (PLEG): container finished" podID="58646635-9ae5-4468-b026-e2e262f7810c" containerID="2ff00bd8fa5b5224ce52482d8eb6fc5fc28be92484dcd15b647544248bb53751" exitCode=0
Apr 17 07:56:13.677013 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:13.676679 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-bvdcl" event={"ID":"58646635-9ae5-4468-b026-e2e262f7810c","Type":"ContainerDied","Data":"2ff00bd8fa5b5224ce52482d8eb6fc5fc28be92484dcd15b647544248bb53751"}
Apr 17 07:56:13.677013 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:13.677007 2576 scope.go:117] "RemoveContainer" containerID="2ff00bd8fa5b5224ce52482d8eb6fc5fc28be92484dcd15b647544248bb53751"
Apr 17 07:56:14.680758 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:14.680716 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-bvdcl" event={"ID":"58646635-9ae5-4468-b026-e2e262f7810c","Type":"ContainerStarted","Data":"246c653ddbe2471c3a446b5d895f0a220d164a36f45891bda528a4ec66e2fc4c"}
Apr 17 07:56:14.681866 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:14.681842 2576 generic.go:358] "Generic (PLEG): container finished" podID="ef9177b8-0879-4607-8085-b87914bfa611" containerID="e006ad9614795805c694ed4c68f559a1a618e45d71cf5bc3c5727ce357bd0eb0" exitCode=0
Apr 17 07:56:14.681984 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:14.681907 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fn8dn" event={"ID":"ef9177b8-0879-4607-8085-b87914bfa611","Type":"ContainerDied","Data":"e006ad9614795805c694ed4c68f559a1a618e45d71cf5bc3c5727ce357bd0eb0"}
Apr 17 07:56:14.682216 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:14.682201 2576 scope.go:117] "RemoveContainer" containerID="e006ad9614795805c694ed4c68f559a1a618e45d71cf5bc3c5727ce357bd0eb0"
Apr 17 07:56:15.686131 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:15.686090 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fn8dn" event={"ID":"ef9177b8-0879-4607-8085-b87914bfa611","Type":"ContainerStarted","Data":"6a9ec779ea6baf1475d862213507d25146998e5ce6d8f7d8b124ace0b68912ed"}
Apr 17 07:56:24.634032 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:24.633996 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs\") pod \"network-metrics-daemon-wxvbl\" (UID: \"0cd46437-1e4d-4927-88fe-3d5f18ee621d\") " pod="openshift-multus/network-metrics-daemon-wxvbl"
Apr 17 07:56:24.648355 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:24.648322 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0cd46437-1e4d-4927-88fe-3d5f18ee621d-metrics-certs\") pod \"network-metrics-daemon-wxvbl\" (UID: \"0cd46437-1e4d-4927-88fe-3d5f18ee621d\") " pod="openshift-multus/network-metrics-daemon-wxvbl"
Apr 17 07:56:24.862614 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:24.862584 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jkb85\""
Apr 17 07:56:24.870520 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:24.870496 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wxvbl"
Apr 17 07:56:25.009940 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:25.009899 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wxvbl"]
Apr 17 07:56:25.013049 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:56:25.013020 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cd46437_1e4d_4927_88fe_3d5f18ee621d.slice/crio-f61b1637d3f620dd862fa92a65c8960179cd33781423c18a0e479058aa319b92 WatchSource:0}: Error finding container f61b1637d3f620dd862fa92a65c8960179cd33781423c18a0e479058aa319b92: Status 404 returned error can't find the container with id f61b1637d3f620dd862fa92a65c8960179cd33781423c18a0e479058aa319b92
Apr 17 07:56:25.717378 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:25.717309 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wxvbl" event={"ID":"0cd46437-1e4d-4927-88fe-3d5f18ee621d","Type":"ContainerStarted","Data":"f61b1637d3f620dd862fa92a65c8960179cd33781423c18a0e479058aa319b92"}
Apr 17 07:56:27.727385 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:27.727338 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wxvbl" event={"ID":"0cd46437-1e4d-4927-88fe-3d5f18ee621d","Type":"ContainerStarted","Data":"8225682bfc71078cc492654eeca4076884e3b26b790243a884cb0ff99dcaa4be"}
Apr 17 07:56:27.728009 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:27.727978 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wxvbl" event={"ID":"0cd46437-1e4d-4927-88fe-3d5f18ee621d","Type":"ContainerStarted","Data":"7965ddb052ca7d55b2dfb3d993ec237612026159e4b2aa94ab77512b4620e260"}
Apr 17 07:56:27.744181 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:27.744133 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wxvbl" podStartSLOduration=254.213926299 podStartE2EDuration="4m15.744117856s" podCreationTimestamp="2026-04-17 07:52:12 +0000 UTC" firstStartedPulling="2026-04-17 07:56:25.015387032 +0000 UTC m=+252.672250224" lastFinishedPulling="2026-04-17 07:56:26.545578589 +0000 UTC m=+254.202441781" observedRunningTime="2026-04-17 07:56:27.742925031 +0000 UTC m=+255.399788245" watchObservedRunningTime="2026-04-17 07:56:27.744117856 +0000 UTC m=+255.400981068"
Apr 17 07:56:31.971539 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:31.971507 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:31.986944 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:31.986920 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:32.759109 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:32.759082 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:50.029464 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:50.029426 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 07:56:50.031482 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:50.031149 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="kube-rbac-proxy" containerID="cri-o://119d2a2a6be93908c7f0ea600030d152c321c7a32fa567170340d119886a5dac" gracePeriod=600
Apr 17 07:56:50.031482 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:50.031341 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="config-reloader" containerID="cri-o://1462b1d807898936362946501cbdf764ebb49a7a530dd9d298c01a472576bf03" gracePeriod=600
Apr 17 07:56:50.031482 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:50.031357 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="thanos-sidecar" containerID="cri-o://544830f11303ccb60d57e7c0a7ee4a377927aaea8efb069a538c91650195f7b3" gracePeriod=600
Apr 17 07:56:50.031482 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:50.031147 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="prometheus" containerID="cri-o://2c3fb30881f22bdbd57792fc58d7dc6a01e988fc43e0202c47513c23d5002f9e" gracePeriod=600
Apr 17 07:56:50.031482 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:50.031414 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="kube-rbac-proxy-thanos" containerID="cri-o://32db80d63963ec3008ce06e40ea11c33f673d9dbe4eef90592d88a2d2964f8b5" gracePeriod=600
Apr 17 07:56:50.031482 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:50.031461 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="kube-rbac-proxy-web" containerID="cri-o://fdbfa8a473a193609399e178bda1c9d418c2961e0fd720d820e08def53189d87" gracePeriod=600
Apr 17 07:56:50.797540 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:50.797506 2576 generic.go:358] "Generic (PLEG): container finished" podID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerID="32db80d63963ec3008ce06e40ea11c33f673d9dbe4eef90592d88a2d2964f8b5" exitCode=0
Apr 17 07:56:50.797540 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:50.797534 2576 generic.go:358] "Generic (PLEG): container finished" podID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerID="119d2a2a6be93908c7f0ea600030d152c321c7a32fa567170340d119886a5dac" exitCode=0
Apr 17 07:56:50.797540 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:50.797544 2576 generic.go:358] "Generic (PLEG): container finished" podID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerID="544830f11303ccb60d57e7c0a7ee4a377927aaea8efb069a538c91650195f7b3" exitCode=0
Apr 17 07:56:50.797540 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:50.797551 2576 generic.go:358] "Generic (PLEG): container finished" podID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerID="1462b1d807898936362946501cbdf764ebb49a7a530dd9d298c01a472576bf03" exitCode=0
Apr 17 07:56:50.797810 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:50.797559 2576 generic.go:358] "Generic (PLEG): container finished" podID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerID="2c3fb30881f22bdbd57792fc58d7dc6a01e988fc43e0202c47513c23d5002f9e" exitCode=0
Apr 17 07:56:50.797810 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:50.797584 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a4c57183-ce96-44f3-9b36-e53c4d00f0e6","Type":"ContainerDied","Data":"32db80d63963ec3008ce06e40ea11c33f673d9dbe4eef90592d88a2d2964f8b5"}
Apr 17 07:56:50.797810 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:50.797614 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a4c57183-ce96-44f3-9b36-e53c4d00f0e6","Type":"ContainerDied","Data":"119d2a2a6be93908c7f0ea600030d152c321c7a32fa567170340d119886a5dac"}
Apr 17 07:56:50.797810 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:50.797623 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a4c57183-ce96-44f3-9b36-e53c4d00f0e6","Type":"ContainerDied","Data":"544830f11303ccb60d57e7c0a7ee4a377927aaea8efb069a538c91650195f7b3"}
Apr 17 07:56:50.797810 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:50.797631 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a4c57183-ce96-44f3-9b36-e53c4d00f0e6","Type":"ContainerDied","Data":"1462b1d807898936362946501cbdf764ebb49a7a530dd9d298c01a472576bf03"}
Apr 17 07:56:50.797810 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:50.797639 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a4c57183-ce96-44f3-9b36-e53c4d00f0e6","Type":"ContainerDied","Data":"2c3fb30881f22bdbd57792fc58d7dc6a01e988fc43e0202c47513c23d5002f9e"}
Apr 17 07:56:51.265676 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.265651 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:51.369852 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:56:51.369754 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs" podUID="7a34eada-e251-4bc7-8937-8f933c0cbd6f"
Apr 17 07:56:51.369852 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:56:51.369755 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-9w6bd" podUID="8ff81f11-f2e2-4838-a775-e57edc28571c"
Apr 17 07:56:51.369852 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:56:51.369755 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6f2m8" podUID="b1a369df-257c-47a4-96da-3025f897b1dd"
Apr 17 07:56:51.385962 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.385930 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-thanos-prometheus-http-client-file\") pod \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") "
Apr 17 07:56:51.386081 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.385973 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") "
Apr 17 07:56:51.386081 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.385999 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nkz9\" (UniqueName: \"kubernetes.io/projected/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-kube-api-access-6nkz9\") pod \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") "
Apr 17 07:56:51.386194 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.386168 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-metrics-client-certs\") pod \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") "
Apr 17 07:56:51.386250 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.386221 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-config-out\") pod \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") "
Apr 17 07:56:51.386340 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.386267 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-configmap-serving-certs-ca-bundle\") pod \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") "
Apr 17 07:56:51.386410 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.386337 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-prometheus-trusted-ca-bundle\") pod \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") "
Apr 17 07:56:51.386410 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.386368 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-kube-rbac-proxy\") pod \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") "
Apr 17 07:56:51.386510 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.386412 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-prometheus-k8s-tls\") pod \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") "
Apr 17 07:56:51.386510 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.386453 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-config\") pod \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") "
Apr 17 07:56:51.386510 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.386480 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-grpc-tls\") pod \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") "
Apr 17 07:56:51.386647 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.386511 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") "
Apr 17 07:56:51.386647 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.386538 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-configmap-metrics-client-ca\") pod \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") "
Apr 17 07:56:51.386647 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.386576 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-configmap-kubelet-serving-ca-bundle\") pod \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") "
Apr 17 07:56:51.386647 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.386608 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-prometheus-k8s-rulefiles-0\") pod \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") "
Apr 17 07:56:51.386848 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.386652 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-prometheus-k8s-db\") pod \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") "
Apr 17 07:56:51.386848 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.386676 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-tls-assets\") pod \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") "
Apr 17 07:56:51.386848 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.386718 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-web-config\") pod \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\" (UID: \"a4c57183-ce96-44f3-9b36-e53c4d00f0e6\") "
Apr 17 07:56:51.386848 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.386788 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "a4c57183-ce96-44f3-9b36-e53c4d00f0e6" (UID: "a4c57183-ce96-44f3-9b36-e53c4d00f0e6"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:56:51.387046 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.386853 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "a4c57183-ce96-44f3-9b36-e53c4d00f0e6" (UID: "a4c57183-ce96-44f3-9b36-e53c4d00f0e6"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:56:51.387112 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.387075 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\""
Apr 17 07:56:51.387112 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.387099 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-prometheus-trusted-ca-bundle\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\""
Apr 17 07:56:51.388066 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.388037 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "a4c57183-ce96-44f3-9b36-e53c4d00f0e6" (UID: "a4c57183-ce96-44f3-9b36-e53c4d00f0e6"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:56:51.388484 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.388457 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "a4c57183-ce96-44f3-9b36-e53c4d00f0e6" (UID: "a4c57183-ce96-44f3-9b36-e53c4d00f0e6"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:56:51.389105 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.389079 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "a4c57183-ce96-44f3-9b36-e53c4d00f0e6" (UID: "a4c57183-ce96-44f3-9b36-e53c4d00f0e6"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 07:56:51.389922 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.389898 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "a4c57183-ce96-44f3-9b36-e53c4d00f0e6" (UID: "a4c57183-ce96-44f3-9b36-e53c4d00f0e6"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:56:51.389922 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.389909 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "a4c57183-ce96-44f3-9b36-e53c4d00f0e6" (UID: "a4c57183-ce96-44f3-9b36-e53c4d00f0e6"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 07:56:51.390294 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.390254 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "a4c57183-ce96-44f3-9b36-e53c4d00f0e6" (UID: "a4c57183-ce96-44f3-9b36-e53c4d00f0e6"). InnerVolumeSpecName "prometheus-k8s-db".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 07:56:51.390399 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.390259 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "a4c57183-ce96-44f3-9b36-e53c4d00f0e6" (UID: "a4c57183-ce96-44f3-9b36-e53c4d00f0e6"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:51.390462 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.390433 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "a4c57183-ce96-44f3-9b36-e53c4d00f0e6" (UID: "a4c57183-ce96-44f3-9b36-e53c4d00f0e6"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:51.390528 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.390489 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-config-out" (OuterVolumeSpecName: "config-out") pod "a4c57183-ce96-44f3-9b36-e53c4d00f0e6" (UID: "a4c57183-ce96-44f3-9b36-e53c4d00f0e6"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 07:56:51.390579 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.390525 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-kube-api-access-6nkz9" (OuterVolumeSpecName: "kube-api-access-6nkz9") pod "a4c57183-ce96-44f3-9b36-e53c4d00f0e6" (UID: "a4c57183-ce96-44f3-9b36-e53c4d00f0e6"). InnerVolumeSpecName "kube-api-access-6nkz9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:56:51.390737 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.390711 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "a4c57183-ce96-44f3-9b36-e53c4d00f0e6" (UID: "a4c57183-ce96-44f3-9b36-e53c4d00f0e6"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:51.390848 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.390829 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-config" (OuterVolumeSpecName: "config") pod "a4c57183-ce96-44f3-9b36-e53c4d00f0e6" (UID: "a4c57183-ce96-44f3-9b36-e53c4d00f0e6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:51.391873 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.391839 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "a4c57183-ce96-44f3-9b36-e53c4d00f0e6" (UID: "a4c57183-ce96-44f3-9b36-e53c4d00f0e6"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:51.392047 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.392022 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "a4c57183-ce96-44f3-9b36-e53c4d00f0e6" (UID: "a4c57183-ce96-44f3-9b36-e53c4d00f0e6"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:51.392420 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.392394 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "a4c57183-ce96-44f3-9b36-e53c4d00f0e6" (UID: "a4c57183-ce96-44f3-9b36-e53c4d00f0e6"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:56:51.401928 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.401899 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-web-config" (OuterVolumeSpecName: "web-config") pod "a4c57183-ce96-44f3-9b36-e53c4d00f0e6" (UID: "a4c57183-ce96-44f3-9b36-e53c4d00f0e6"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:51.488236 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.488198 2576 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-kube-rbac-proxy\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\"" Apr 17 07:56:51.488236 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.488231 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-prometheus-k8s-tls\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\"" Apr 17 07:56:51.488514 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.488245 2576 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-config\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\"" Apr 17 07:56:51.488514 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.488260 2576 
reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-grpc-tls\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\"" Apr 17 07:56:51.488514 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.488303 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\"" Apr 17 07:56:51.488514 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.488316 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-configmap-metrics-client-ca\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\"" Apr 17 07:56:51.488514 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.488330 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\"" Apr 17 07:56:51.488514 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.488343 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\"" Apr 17 07:56:51.488514 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.488356 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-prometheus-k8s-db\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\"" Apr 17 07:56:51.488514 ip-10-0-137-165 kubenswrapper[2576]: I0417 
07:56:51.488368 2576 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-tls-assets\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\"" Apr 17 07:56:51.488514 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.488380 2576 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-web-config\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\"" Apr 17 07:56:51.488514 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.488391 2576 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-thanos-prometheus-http-client-file\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\"" Apr 17 07:56:51.488514 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.488405 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\"" Apr 17 07:56:51.488514 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.488417 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6nkz9\" (UniqueName: \"kubernetes.io/projected/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-kube-api-access-6nkz9\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\"" Apr 17 07:56:51.488514 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.488430 2576 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-secret-metrics-client-certs\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\"" Apr 17 07:56:51.488514 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.488442 2576 
reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a4c57183-ce96-44f3-9b36-e53c4d00f0e6-config-out\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\"" Apr 17 07:56:51.803366 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.803261 2576 generic.go:358] "Generic (PLEG): container finished" podID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerID="fdbfa8a473a193609399e178bda1c9d418c2961e0fd720d820e08def53189d87" exitCode=0 Apr 17 07:56:51.803517 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.803372 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6f2m8" Apr 17 07:56:51.803517 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.803378 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:51.803517 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.803372 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs" Apr 17 07:56:51.803517 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.803365 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a4c57183-ce96-44f3-9b36-e53c4d00f0e6","Type":"ContainerDied","Data":"fdbfa8a473a193609399e178bda1c9d418c2961e0fd720d820e08def53189d87"} Apr 17 07:56:51.803517 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.803461 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9w6bd" Apr 17 07:56:51.803517 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.803473 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a4c57183-ce96-44f3-9b36-e53c4d00f0e6","Type":"ContainerDied","Data":"01f2bcc7738ae2cc4810bc886c65fce65a13bc20730ee4e969068caf84a10b3c"} Apr 17 07:56:51.803517 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.803498 2576 scope.go:117] "RemoveContainer" containerID="32db80d63963ec3008ce06e40ea11c33f673d9dbe4eef90592d88a2d2964f8b5" Apr 17 07:56:51.811735 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.811716 2576 scope.go:117] "RemoveContainer" containerID="119d2a2a6be93908c7f0ea600030d152c321c7a32fa567170340d119886a5dac" Apr 17 07:56:51.818398 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.818382 2576 scope.go:117] "RemoveContainer" containerID="fdbfa8a473a193609399e178bda1c9d418c2961e0fd720d820e08def53189d87" Apr 17 07:56:51.824795 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.824775 2576 scope.go:117] "RemoveContainer" containerID="544830f11303ccb60d57e7c0a7ee4a377927aaea8efb069a538c91650195f7b3" Apr 17 07:56:51.827225 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.827185 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 07:56:51.831028 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.831008 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 07:56:51.832384 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.832369 2576 scope.go:117] "RemoveContainer" containerID="1462b1d807898936362946501cbdf764ebb49a7a530dd9d298c01a472576bf03" Apr 17 07:56:51.838634 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.838618 2576 scope.go:117] "RemoveContainer" containerID="2c3fb30881f22bdbd57792fc58d7dc6a01e988fc43e0202c47513c23d5002f9e" Apr 17 07:56:51.845245 ip-10-0-137-165 
kubenswrapper[2576]: I0417 07:56:51.845229 2576 scope.go:117] "RemoveContainer" containerID="74ecf940db2b179c535381e7940cbccda4b249a9e7c96e67547392a90b0bb2f3" Apr 17 07:56:51.851368 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.851354 2576 scope.go:117] "RemoveContainer" containerID="32db80d63963ec3008ce06e40ea11c33f673d9dbe4eef90592d88a2d2964f8b5" Apr 17 07:56:51.851636 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:56:51.851606 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32db80d63963ec3008ce06e40ea11c33f673d9dbe4eef90592d88a2d2964f8b5\": container with ID starting with 32db80d63963ec3008ce06e40ea11c33f673d9dbe4eef90592d88a2d2964f8b5 not found: ID does not exist" containerID="32db80d63963ec3008ce06e40ea11c33f673d9dbe4eef90592d88a2d2964f8b5" Apr 17 07:56:51.851679 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.851640 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32db80d63963ec3008ce06e40ea11c33f673d9dbe4eef90592d88a2d2964f8b5"} err="failed to get container status \"32db80d63963ec3008ce06e40ea11c33f673d9dbe4eef90592d88a2d2964f8b5\": rpc error: code = NotFound desc = could not find container \"32db80d63963ec3008ce06e40ea11c33f673d9dbe4eef90592d88a2d2964f8b5\": container with ID starting with 32db80d63963ec3008ce06e40ea11c33f673d9dbe4eef90592d88a2d2964f8b5 not found: ID does not exist" Apr 17 07:56:51.851679 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.851677 2576 scope.go:117] "RemoveContainer" containerID="119d2a2a6be93908c7f0ea600030d152c321c7a32fa567170340d119886a5dac" Apr 17 07:56:51.851901 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:56:51.851885 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"119d2a2a6be93908c7f0ea600030d152c321c7a32fa567170340d119886a5dac\": container with ID starting with 
119d2a2a6be93908c7f0ea600030d152c321c7a32fa567170340d119886a5dac not found: ID does not exist" containerID="119d2a2a6be93908c7f0ea600030d152c321c7a32fa567170340d119886a5dac" Apr 17 07:56:51.851938 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.851909 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119d2a2a6be93908c7f0ea600030d152c321c7a32fa567170340d119886a5dac"} err="failed to get container status \"119d2a2a6be93908c7f0ea600030d152c321c7a32fa567170340d119886a5dac\": rpc error: code = NotFound desc = could not find container \"119d2a2a6be93908c7f0ea600030d152c321c7a32fa567170340d119886a5dac\": container with ID starting with 119d2a2a6be93908c7f0ea600030d152c321c7a32fa567170340d119886a5dac not found: ID does not exist" Apr 17 07:56:51.851938 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.851927 2576 scope.go:117] "RemoveContainer" containerID="fdbfa8a473a193609399e178bda1c9d418c2961e0fd720d820e08def53189d87" Apr 17 07:56:51.852139 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:56:51.852126 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdbfa8a473a193609399e178bda1c9d418c2961e0fd720d820e08def53189d87\": container with ID starting with fdbfa8a473a193609399e178bda1c9d418c2961e0fd720d820e08def53189d87 not found: ID does not exist" containerID="fdbfa8a473a193609399e178bda1c9d418c2961e0fd720d820e08def53189d87" Apr 17 07:56:51.852190 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.852141 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdbfa8a473a193609399e178bda1c9d418c2961e0fd720d820e08def53189d87"} err="failed to get container status \"fdbfa8a473a193609399e178bda1c9d418c2961e0fd720d820e08def53189d87\": rpc error: code = NotFound desc = could not find container \"fdbfa8a473a193609399e178bda1c9d418c2961e0fd720d820e08def53189d87\": container with ID starting with 
fdbfa8a473a193609399e178bda1c9d418c2961e0fd720d820e08def53189d87 not found: ID does not exist" Apr 17 07:56:51.852190 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.852153 2576 scope.go:117] "RemoveContainer" containerID="544830f11303ccb60d57e7c0a7ee4a377927aaea8efb069a538c91650195f7b3" Apr 17 07:56:51.852391 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:56:51.852374 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"544830f11303ccb60d57e7c0a7ee4a377927aaea8efb069a538c91650195f7b3\": container with ID starting with 544830f11303ccb60d57e7c0a7ee4a377927aaea8efb069a538c91650195f7b3 not found: ID does not exist" containerID="544830f11303ccb60d57e7c0a7ee4a377927aaea8efb069a538c91650195f7b3" Apr 17 07:56:51.852449 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.852399 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"544830f11303ccb60d57e7c0a7ee4a377927aaea8efb069a538c91650195f7b3"} err="failed to get container status \"544830f11303ccb60d57e7c0a7ee4a377927aaea8efb069a538c91650195f7b3\": rpc error: code = NotFound desc = could not find container \"544830f11303ccb60d57e7c0a7ee4a377927aaea8efb069a538c91650195f7b3\": container with ID starting with 544830f11303ccb60d57e7c0a7ee4a377927aaea8efb069a538c91650195f7b3 not found: ID does not exist" Apr 17 07:56:51.852449 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.852416 2576 scope.go:117] "RemoveContainer" containerID="1462b1d807898936362946501cbdf764ebb49a7a530dd9d298c01a472576bf03" Apr 17 07:56:51.852646 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:56:51.852630 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1462b1d807898936362946501cbdf764ebb49a7a530dd9d298c01a472576bf03\": container with ID starting with 1462b1d807898936362946501cbdf764ebb49a7a530dd9d298c01a472576bf03 not found: ID does not exist" 
containerID="1462b1d807898936362946501cbdf764ebb49a7a530dd9d298c01a472576bf03" Apr 17 07:56:51.852688 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.852660 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1462b1d807898936362946501cbdf764ebb49a7a530dd9d298c01a472576bf03"} err="failed to get container status \"1462b1d807898936362946501cbdf764ebb49a7a530dd9d298c01a472576bf03\": rpc error: code = NotFound desc = could not find container \"1462b1d807898936362946501cbdf764ebb49a7a530dd9d298c01a472576bf03\": container with ID starting with 1462b1d807898936362946501cbdf764ebb49a7a530dd9d298c01a472576bf03 not found: ID does not exist" Apr 17 07:56:51.852688 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.852675 2576 scope.go:117] "RemoveContainer" containerID="2c3fb30881f22bdbd57792fc58d7dc6a01e988fc43e0202c47513c23d5002f9e" Apr 17 07:56:51.852903 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:56:51.852885 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c3fb30881f22bdbd57792fc58d7dc6a01e988fc43e0202c47513c23d5002f9e\": container with ID starting with 2c3fb30881f22bdbd57792fc58d7dc6a01e988fc43e0202c47513c23d5002f9e not found: ID does not exist" containerID="2c3fb30881f22bdbd57792fc58d7dc6a01e988fc43e0202c47513c23d5002f9e" Apr 17 07:56:51.852953 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.852909 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c3fb30881f22bdbd57792fc58d7dc6a01e988fc43e0202c47513c23d5002f9e"} err="failed to get container status \"2c3fb30881f22bdbd57792fc58d7dc6a01e988fc43e0202c47513c23d5002f9e\": rpc error: code = NotFound desc = could not find container \"2c3fb30881f22bdbd57792fc58d7dc6a01e988fc43e0202c47513c23d5002f9e\": container with ID starting with 2c3fb30881f22bdbd57792fc58d7dc6a01e988fc43e0202c47513c23d5002f9e not found: ID does not exist" Apr 17 
07:56:51.852953 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.852929 2576 scope.go:117] "RemoveContainer" containerID="74ecf940db2b179c535381e7940cbccda4b249a9e7c96e67547392a90b0bb2f3" Apr 17 07:56:51.853154 ip-10-0-137-165 kubenswrapper[2576]: E0417 07:56:51.853140 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74ecf940db2b179c535381e7940cbccda4b249a9e7c96e67547392a90b0bb2f3\": container with ID starting with 74ecf940db2b179c535381e7940cbccda4b249a9e7c96e67547392a90b0bb2f3 not found: ID does not exist" containerID="74ecf940db2b179c535381e7940cbccda4b249a9e7c96e67547392a90b0bb2f3" Apr 17 07:56:51.853196 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.853158 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ecf940db2b179c535381e7940cbccda4b249a9e7c96e67547392a90b0bb2f3"} err="failed to get container status \"74ecf940db2b179c535381e7940cbccda4b249a9e7c96e67547392a90b0bb2f3\": rpc error: code = NotFound desc = could not find container \"74ecf940db2b179c535381e7940cbccda4b249a9e7c96e67547392a90b0bb2f3\": container with ID starting with 74ecf940db2b179c535381e7940cbccda4b249a9e7c96e67547392a90b0bb2f3 not found: ID does not exist" Apr 17 07:56:51.858126 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.858107 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 07:56:51.858468 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.858454 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="kube-rbac-proxy-web" Apr 17 07:56:51.858537 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.858470 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="kube-rbac-proxy-web" Apr 17 07:56:51.858537 ip-10-0-137-165 kubenswrapper[2576]: I0417 
07:56:51.858485 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="thanos-sidecar" Apr 17 07:56:51.858537 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.858490 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="thanos-sidecar" Apr 17 07:56:51.858537 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.858497 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="kube-rbac-proxy" Apr 17 07:56:51.858537 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.858502 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="kube-rbac-proxy" Apr 17 07:56:51.858537 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.858515 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="prometheus" Apr 17 07:56:51.858537 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.858519 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="prometheus" Apr 17 07:56:51.858537 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.858528 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="init-config-reloader" Apr 17 07:56:51.858537 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.858533 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="init-config-reloader" Apr 17 07:56:51.858537 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.858539 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="kube-rbac-proxy-thanos" Apr 17 07:56:51.858904 
ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.858545 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="kube-rbac-proxy-thanos"
Apr 17 07:56:51.858904 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.858550 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="config-reloader"
Apr 17 07:56:51.858904 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.858555 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="config-reloader"
Apr 17 07:56:51.858904 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.858601 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="kube-rbac-proxy"
Apr 17 07:56:51.858904 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.858609 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="prometheus"
Apr 17 07:56:51.858904 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.858616 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="config-reloader"
Apr 17 07:56:51.858904 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.858622 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="kube-rbac-proxy-thanos"
Apr 17 07:56:51.858904 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.858628 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="thanos-sidecar"
Apr 17 07:56:51.858904 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.858633 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" containerName="kube-rbac-proxy-web"
Apr 17 07:56:51.863831 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.863816 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:51.866346 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.866327 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8u5ojncutude3\""
Apr 17 07:56:51.866532 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.866486 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 17 07:56:51.866532 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.866527 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 17 07:56:51.866832 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.866553 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 17 07:56:51.866832 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.866601 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 17 07:56:51.866832 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.866639 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 17 07:56:51.866832 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.866641 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 17 07:56:51.866832 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.866691 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 17 07:56:51.867083 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.867065 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-85w9j\""
Apr 17 07:56:51.867083 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.867076 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 17 07:56:51.867206 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.867088 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 17 07:56:51.867206 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.867113 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 07:56:51.867206 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.867097 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 17 07:56:51.870657 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.870635 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 17 07:56:51.872366 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.872350 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 17 07:56:51.875413 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.875394 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 07:56:51.992857 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.992821 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:51.993019 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.992884 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:51.993019 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.992904 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/38ce247e-0fc4-4704-abe7-7e265a873052-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:51.993019 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.992958 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/38ce247e-0fc4-4704-abe7-7e265a873052-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:51.993144 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.993039 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx942\" (UniqueName: \"kubernetes.io/projected/38ce247e-0fc4-4704-abe7-7e265a873052-kube-api-access-wx942\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:51.993144 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.993069 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:51.993144 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.993086 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/38ce247e-0fc4-4704-abe7-7e265a873052-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:51.993144 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.993108 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:51.993144 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.993131 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/38ce247e-0fc4-4704-abe7-7e265a873052-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:51.993325 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.993152 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38ce247e-0fc4-4704-abe7-7e265a873052-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:51.993325 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.993183 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-config\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:51.993325 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.993264 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:51.993416 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.993324 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38ce247e-0fc4-4704-abe7-7e265a873052-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:51.993416 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.993346 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:51.993416 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.993363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38ce247e-0fc4-4704-abe7-7e265a873052-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:51.993416 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.993404 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/38ce247e-0fc4-4704-abe7-7e265a873052-config-out\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:51.993529 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.993442 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:51.993529 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:51.993463 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-web-config\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.094355 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.094323 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wx942\" (UniqueName: \"kubernetes.io/projected/38ce247e-0fc4-4704-abe7-7e265a873052-kube-api-access-wx942\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.094495 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.094361 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.094495 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.094379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/38ce247e-0fc4-4704-abe7-7e265a873052-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.094495 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.094407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.094495 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.094435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/38ce247e-0fc4-4704-abe7-7e265a873052-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.094495 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.094461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38ce247e-0fc4-4704-abe7-7e265a873052-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.094725 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.094503 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-config\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.094725 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.094562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.094725 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.094587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38ce247e-0fc4-4704-abe7-7e265a873052-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.094725 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.094614 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.094725 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.094635 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38ce247e-0fc4-4704-abe7-7e265a873052-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.094725 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.094667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/38ce247e-0fc4-4704-abe7-7e265a873052-config-out\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.094725 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.094713 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.095071 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.094736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-web-config\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.095071 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.094776 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.095071 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.094817 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/38ce247e-0fc4-4704-abe7-7e265a873052-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.095071 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.094838 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.095071 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.094891 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/38ce247e-0fc4-4704-abe7-7e265a873052-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.095071 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.094920 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/38ce247e-0fc4-4704-abe7-7e265a873052-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.095385 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.095300 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38ce247e-0fc4-4704-abe7-7e265a873052-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.095385 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.095352 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38ce247e-0fc4-4704-abe7-7e265a873052-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.095979 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.095954 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/38ce247e-0fc4-4704-abe7-7e265a873052-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.097766 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.097582 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.097766 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.097634 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.097766 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.097646 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/38ce247e-0fc4-4704-abe7-7e265a873052-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.097766 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.097664 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.098035 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.097848 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.098150 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.098127 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-config\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.098445 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.098422 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/38ce247e-0fc4-4704-abe7-7e265a873052-config-out\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.098796 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.098772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/38ce247e-0fc4-4704-abe7-7e265a873052-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.098796 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.098786 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38ce247e-0fc4-4704-abe7-7e265a873052-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.099811 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.099786 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-web-config\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.099943 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.099927 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.100126 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.100109 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.100239 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.100220 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/38ce247e-0fc4-4704-abe7-7e265a873052-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.101927 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.101911 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx942\" (UniqueName: \"kubernetes.io/projected/38ce247e-0fc4-4704-abe7-7e265a873052-kube-api-access-wx942\") pod \"prometheus-k8s-0\" (UID: \"38ce247e-0fc4-4704-abe7-7e265a873052\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.173680 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.173651 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:52.302859 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.302813 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 07:56:52.306246 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:56:52.306208 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38ce247e_0fc4_4704_abe7_7e265a873052.slice/crio-2a4a91d5480b3bf1a6a635f23124e1019dd9b4af2cbbc128f43cd708e93c3ca0 WatchSource:0}: Error finding container 2a4a91d5480b3bf1a6a635f23124e1019dd9b4af2cbbc128f43cd708e93c3ca0: Status 404 returned error can't find the container with id 2a4a91d5480b3bf1a6a635f23124e1019dd9b4af2cbbc128f43cd708e93c3ca0
Apr 17 07:56:52.806978 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.806894 2576 generic.go:358] "Generic (PLEG): container finished" podID="38ce247e-0fc4-4704-abe7-7e265a873052" containerID="430398763fb1b9507b67707427f97fa1dae8d169ff0024eb2d76ed322d0892c5" exitCode=0
Apr 17 07:56:52.806978 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.806963 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"38ce247e-0fc4-4704-abe7-7e265a873052","Type":"ContainerDied","Data":"430398763fb1b9507b67707427f97fa1dae8d169ff0024eb2d76ed322d0892c5"}
Apr 17 07:56:52.807160 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.806997 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"38ce247e-0fc4-4704-abe7-7e265a873052","Type":"ContainerStarted","Data":"2a4a91d5480b3bf1a6a635f23124e1019dd9b4af2cbbc128f43cd708e93c3ca0"}
Apr 17 07:56:52.862816 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:52.862793 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4c57183-ce96-44f3-9b36-e53c4d00f0e6" path="/var/lib/kubelet/pods/a4c57183-ce96-44f3-9b36-e53c4d00f0e6/volumes"
Apr 17 07:56:53.814563 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:53.814525 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"38ce247e-0fc4-4704-abe7-7e265a873052","Type":"ContainerStarted","Data":"e964e4a4853f47e52b9b5f1c924a5d33756bb3eb0c626d5311e0bc32f95623f2"}
Apr 17 07:56:53.815106 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:53.815077 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"38ce247e-0fc4-4704-abe7-7e265a873052","Type":"ContainerStarted","Data":"6867322629c030de1d23c2e32089a1bec94451ee0ffc77d850dce850e4aa63ca"}
Apr 17 07:56:53.815239 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:53.815223 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"38ce247e-0fc4-4704-abe7-7e265a873052","Type":"ContainerStarted","Data":"5163658f98896bfc915c5b5e842dd4167318bfda0419c1cf3123195e52f1cba3"}
Apr 17 07:56:53.815369 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:53.815356 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"38ce247e-0fc4-4704-abe7-7e265a873052","Type":"ContainerStarted","Data":"3241a8d5a45aa2dc9769812621f6eefb0c39be3e46948d42b0bc38d76d207555"}
Apr 17 07:56:53.815469 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:53.815449 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"38ce247e-0fc4-4704-abe7-7e265a873052","Type":"ContainerStarted","Data":"7cd3346f24435ef35dbaaa51bb9a296146115d227f2b34555b4bb0c20d91d0ea"}
Apr 17 07:56:53.815469 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:53.815469 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"38ce247e-0fc4-4704-abe7-7e265a873052","Type":"ContainerStarted","Data":"179f18df97e5689230bf81edbd1377f6701a53c9216f9cd9b2d10fc13a8202bf"}
Apr 17 07:56:53.846698 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:53.846632 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.846613691 podStartE2EDuration="2.846613691s" podCreationTimestamp="2026-04-17 07:56:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:56:53.84395459 +0000 UTC m=+281.500817802" watchObservedRunningTime="2026-04-17 07:56:53.846613691 +0000 UTC m=+281.503476906"
Apr 17 07:56:55.326037 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:55.325992 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-529rs\" (UID: \"7a34eada-e251-4bc7-8937-8f933c0cbd6f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs"
Apr 17 07:56:55.326037 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:55.326038 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8"
Apr 17 07:56:55.326567 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:55.326064 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert\") pod \"ingress-canary-9w6bd\" (UID: \"8ff81f11-f2e2-4838-a775-e57edc28571c\") " pod="openshift-ingress-canary/ingress-canary-9w6bd"
Apr 17 07:56:55.328507 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:55.328475 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1a369df-257c-47a4-96da-3025f897b1dd-metrics-tls\") pod \"dns-default-6f2m8\" (UID: \"b1a369df-257c-47a4-96da-3025f897b1dd\") " pod="openshift-dns/dns-default-6f2m8"
Apr 17 07:56:55.328633 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:55.328561 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ff81f11-f2e2-4838-a775-e57edc28571c-cert\") pod \"ingress-canary-9w6bd\" (UID: \"8ff81f11-f2e2-4838-a775-e57edc28571c\") " pod="openshift-ingress-canary/ingress-canary-9w6bd"
Apr 17 07:56:55.328633 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:55.328601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a34eada-e251-4bc7-8937-8f933c0cbd6f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-529rs\" (UID: \"7a34eada-e251-4bc7-8937-8f933c0cbd6f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs"
Apr 17 07:56:55.407967 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:55.407927 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-d4l2f\""
Apr 17 07:56:55.407967 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:55.407939 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-lp56s\""
Apr 17 07:56:55.408177 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:55.408017 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fmxgb\""
Apr 17 07:56:55.414920 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:55.414891 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6f2m8"
Apr 17 07:56:55.414920 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:55.414917 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs"
Apr 17 07:56:55.415085 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:55.414970 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9w6bd"
Apr 17 07:56:55.556065 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:55.556029 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-529rs"]
Apr 17 07:56:55.560148 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:56:55.560108 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a34eada_e251_4bc7_8937_8f933c0cbd6f.slice/crio-c8ac96b7ba727a661922b4f7daaa2ccefa5c039ac2d425c1cb481c8f78550b42 WatchSource:0}: Error finding container c8ac96b7ba727a661922b4f7daaa2ccefa5c039ac2d425c1cb481c8f78550b42: Status 404 returned error can't find the container with id c8ac96b7ba727a661922b4f7daaa2ccefa5c039ac2d425c1cb481c8f78550b42
Apr 17 07:56:55.576170 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:55.576085 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9w6bd"]
Apr 17 07:56:55.579737 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:56:55.579714 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ff81f11_f2e2_4838_a775_e57edc28571c.slice/crio-5f4c5ccd407933691c8a3fac99b43cd1f9d5e8252f53cbde77d558faac868583 WatchSource:0}: Error finding container 5f4c5ccd407933691c8a3fac99b43cd1f9d5e8252f53cbde77d558faac868583: Status 404 returned error can't find the container with id 5f4c5ccd407933691c8a3fac99b43cd1f9d5e8252f53cbde77d558faac868583
Apr 17 07:56:55.598869 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:55.598844 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6f2m8"]
Apr 17 07:56:55.602733 ip-10-0-137-165 kubenswrapper[2576]: W0417 07:56:55.602703 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1a369df_257c_47a4_96da_3025f897b1dd.slice/crio-c24a69d27434c5e393547576079688a4ab822ca6a5ee6ec14b31de8c3f9e372b WatchSource:0}: Error finding container c24a69d27434c5e393547576079688a4ab822ca6a5ee6ec14b31de8c3f9e372b: Status 404 returned error can't find the container with id c24a69d27434c5e393547576079688a4ab822ca6a5ee6ec14b31de8c3f9e372b
Apr 17 07:56:55.821769 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:55.821730 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs" event={"ID":"7a34eada-e251-4bc7-8937-8f933c0cbd6f","Type":"ContainerStarted","Data":"c8ac96b7ba727a661922b4f7daaa2ccefa5c039ac2d425c1cb481c8f78550b42"}
Apr 17 07:56:55.822708 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:55.822683 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6f2m8" event={"ID":"b1a369df-257c-47a4-96da-3025f897b1dd","Type":"ContainerStarted","Data":"c24a69d27434c5e393547576079688a4ab822ca6a5ee6ec14b31de8c3f9e372b"}
Apr 17 07:56:55.823580 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:55.823561 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9w6bd" event={"ID":"8ff81f11-f2e2-4838-a775-e57edc28571c","Type":"ContainerStarted","Data":"5f4c5ccd407933691c8a3fac99b43cd1f9d5e8252f53cbde77d558faac868583"}
Apr 17 07:56:56.828662 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:56.828627 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs" event={"ID":"7a34eada-e251-4bc7-8937-8f933c0cbd6f","Type":"ContainerStarted","Data":"03bf124a66c06769e474e3db4c0e2b5c7598ba34409c555abdba41bc47e8820c"}
Apr 17 07:56:56.842701 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:56.842645 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-529rs" podStartSLOduration=281.734579119 podStartE2EDuration="4m42.842629391s" podCreationTimestamp="2026-04-17 07:52:14 +0000 UTC" firstStartedPulling="2026-04-17 07:56:55.562594298 +0000 UTC m=+283.219457489" lastFinishedPulling="2026-04-17 07:56:56.670644557 +0000 UTC m=+284.327507761" observedRunningTime="2026-04-17 07:56:56.84186644 +0000 UTC m=+284.498729657" watchObservedRunningTime="2026-04-17 07:56:56.842629391 +0000 UTC m=+284.499492607"
Apr 17 07:56:57.174665 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:57.174623 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:56:57.834013 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:57.833970 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6f2m8" event={"ID":"b1a369df-257c-47a4-96da-3025f897b1dd","Type":"ContainerStarted","Data":"6b7ea5cfab2621bea875480987f718b6e8b37c33a8ceacc51dbc19ef29bac1aa"}
Apr 17 07:56:57.835707 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:57.835676 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9w6bd"
event={"ID":"8ff81f11-f2e2-4838-a775-e57edc28571c","Type":"ContainerStarted","Data":"6198a078eea1c7639ad13bb3a0de878513e16a1c04ede1d0b64acc9208587164"} Apr 17 07:56:57.850077 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:57.849633 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9w6bd" podStartSLOduration=250.802495228 podStartE2EDuration="4m12.849616101s" podCreationTimestamp="2026-04-17 07:52:45 +0000 UTC" firstStartedPulling="2026-04-17 07:56:55.581106676 +0000 UTC m=+283.237969870" lastFinishedPulling="2026-04-17 07:56:57.628227549 +0000 UTC m=+285.285090743" observedRunningTime="2026-04-17 07:56:57.849251296 +0000 UTC m=+285.506114502" watchObservedRunningTime="2026-04-17 07:56:57.849616101 +0000 UTC m=+285.506479315" Apr 17 07:56:58.840004 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:58.839960 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6f2m8" event={"ID":"b1a369df-257c-47a4-96da-3025f897b1dd","Type":"ContainerStarted","Data":"2e079ddced7ed3bea36be5e33b8efc10c6e3ca34e4498aac3f8d4c80629f97b1"} Apr 17 07:56:59.842865 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:56:59.842830 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6f2m8" Apr 17 07:57:09.848162 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:57:09.848128 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6f2m8" Apr 17 07:57:09.873444 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:57:09.873393 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6f2m8" podStartSLOduration=262.847041059 podStartE2EDuration="4m24.873377417s" podCreationTimestamp="2026-04-17 07:52:45 +0000 UTC" firstStartedPulling="2026-04-17 07:56:55.604473135 +0000 UTC m=+283.261336326" lastFinishedPulling="2026-04-17 07:56:57.630809493 +0000 UTC m=+285.287672684" 
observedRunningTime="2026-04-17 07:56:58.861446722 +0000 UTC m=+286.518309937" watchObservedRunningTime="2026-04-17 07:57:09.873377417 +0000 UTC m=+297.530240629" Apr 17 07:57:12.778397 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:57:12.778363 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log" Apr 17 07:57:12.778397 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:57:12.778384 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log" Apr 17 07:57:12.787294 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:57:12.787250 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log" Apr 17 07:57:12.787426 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:57:12.787250 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log" Apr 17 07:57:52.174858 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:57:52.174814 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:57:52.189889 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:57:52.189864 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:57:53.014423 ip-10-0-137-165 kubenswrapper[2576]: I0417 07:57:53.010157 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:01:59.336974 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:01:59.336935 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-m77jz"] Apr 17 08:01:59.340228 ip-10-0-137-165 
kubenswrapper[2576]: I0417 08:01:59.340212 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-m77jz" Apr 17 08:01:59.342527 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:01:59.342503 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 08:01:59.342740 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:01:59.342724 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 08:01:59.343528 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:01:59.343509 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 08:01:59.343621 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:01:59.343592 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-77vl8\"" Apr 17 08:01:59.345703 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:01:59.345682 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-m77jz"] Apr 17 08:01:59.372358 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:01:59.372326 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmv4w\" (UniqueName: \"kubernetes.io/projected/83398091-9227-415c-b435-dca3c86f22ff-kube-api-access-pmv4w\") pod \"s3-init-m77jz\" (UID: \"83398091-9227-415c-b435-dca3c86f22ff\") " pod="kserve/s3-init-m77jz" Apr 17 08:01:59.473340 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:01:59.473269 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmv4w\" (UniqueName: \"kubernetes.io/projected/83398091-9227-415c-b435-dca3c86f22ff-kube-api-access-pmv4w\") pod \"s3-init-m77jz\" (UID: \"83398091-9227-415c-b435-dca3c86f22ff\") " pod="kserve/s3-init-m77jz" Apr 17 08:01:59.481361 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:01:59.481335 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmv4w\" (UniqueName: \"kubernetes.io/projected/83398091-9227-415c-b435-dca3c86f22ff-kube-api-access-pmv4w\") pod \"s3-init-m77jz\" (UID: \"83398091-9227-415c-b435-dca3c86f22ff\") " pod="kserve/s3-init-m77jz" Apr 17 08:01:59.656895 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:01:59.656819 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-m77jz" Apr 17 08:01:59.773111 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:01:59.773080 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-m77jz"] Apr 17 08:01:59.776416 ip-10-0-137-165 kubenswrapper[2576]: W0417 08:01:59.776389 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83398091_9227_415c_b435_dca3c86f22ff.slice/crio-d3cd6575f1065b8b9210fc131e6a1a09b5ce14cfdf84c2e5f955a4f7a0c77e42 WatchSource:0}: Error finding container d3cd6575f1065b8b9210fc131e6a1a09b5ce14cfdf84c2e5f955a4f7a0c77e42: Status 404 returned error can't find the container with id d3cd6575f1065b8b9210fc131e6a1a09b5ce14cfdf84c2e5f955a4f7a0c77e42 Apr 17 08:01:59.778582 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:01:59.778565 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:02:00.714151 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:00.714099 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-m77jz" event={"ID":"83398091-9227-415c-b435-dca3c86f22ff","Type":"ContainerStarted","Data":"d3cd6575f1065b8b9210fc131e6a1a09b5ce14cfdf84c2e5f955a4f7a0c77e42"} Apr 17 08:02:04.727865 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:04.727828 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-m77jz" 
event={"ID":"83398091-9227-415c-b435-dca3c86f22ff","Type":"ContainerStarted","Data":"1eb85611a7a59d0bbc6f3d86cd79359cf84fb018b9c6e266e02b7981bd9f6b56"} Apr 17 08:02:04.743066 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:04.743022 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-m77jz" podStartSLOduration=1.366707866 podStartE2EDuration="5.743006534s" podCreationTimestamp="2026-04-17 08:01:59 +0000 UTC" firstStartedPulling="2026-04-17 08:01:59.778715859 +0000 UTC m=+587.435579050" lastFinishedPulling="2026-04-17 08:02:04.155014513 +0000 UTC m=+591.811877718" observedRunningTime="2026-04-17 08:02:04.741914068 +0000 UTC m=+592.398777280" watchObservedRunningTime="2026-04-17 08:02:04.743006534 +0000 UTC m=+592.399869747" Apr 17 08:02:07.737605 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:07.737569 2576 generic.go:358] "Generic (PLEG): container finished" podID="83398091-9227-415c-b435-dca3c86f22ff" containerID="1eb85611a7a59d0bbc6f3d86cd79359cf84fb018b9c6e266e02b7981bd9f6b56" exitCode=0 Apr 17 08:02:07.738026 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:07.737645 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-m77jz" event={"ID":"83398091-9227-415c-b435-dca3c86f22ff","Type":"ContainerDied","Data":"1eb85611a7a59d0bbc6f3d86cd79359cf84fb018b9c6e266e02b7981bd9f6b56"} Apr 17 08:02:08.857477 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:08.857456 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-m77jz" Apr 17 08:02:08.959500 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:08.959468 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmv4w\" (UniqueName: \"kubernetes.io/projected/83398091-9227-415c-b435-dca3c86f22ff-kube-api-access-pmv4w\") pod \"83398091-9227-415c-b435-dca3c86f22ff\" (UID: \"83398091-9227-415c-b435-dca3c86f22ff\") " Apr 17 08:02:08.961549 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:08.961522 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83398091-9227-415c-b435-dca3c86f22ff-kube-api-access-pmv4w" (OuterVolumeSpecName: "kube-api-access-pmv4w") pod "83398091-9227-415c-b435-dca3c86f22ff" (UID: "83398091-9227-415c-b435-dca3c86f22ff"). InnerVolumeSpecName "kube-api-access-pmv4w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:02:09.060138 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:09.060056 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pmv4w\" (UniqueName: \"kubernetes.io/projected/83398091-9227-415c-b435-dca3c86f22ff-kube-api-access-pmv4w\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\"" Apr 17 08:02:09.744756 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:09.744725 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-m77jz" Apr 17 08:02:09.744943 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:09.744730 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-m77jz" event={"ID":"83398091-9227-415c-b435-dca3c86f22ff","Type":"ContainerDied","Data":"d3cd6575f1065b8b9210fc131e6a1a09b5ce14cfdf84c2e5f955a4f7a0c77e42"} Apr 17 08:02:09.744943 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:09.744842 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3cd6575f1065b8b9210fc131e6a1a09b5ce14cfdf84c2e5f955a4f7a0c77e42" Apr 17 08:02:12.805055 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:12.805022 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log" Apr 17 08:02:12.805563 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:12.805236 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log" Apr 17 08:02:12.809519 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:12.809501 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log" Apr 17 08:02:12.809624 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:12.809602 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log" Apr 17 08:02:19.071996 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:19.071946 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5"] Apr 17 08:02:19.072442 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:19.072426 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="83398091-9227-415c-b435-dca3c86f22ff" containerName="s3-init" Apr 17 08:02:19.072500 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:19.072445 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="83398091-9227-415c-b435-dca3c86f22ff" containerName="s3-init" Apr 17 08:02:19.072538 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:19.072516 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="83398091-9227-415c-b435-dca3c86f22ff" containerName="s3-init" Apr 17 08:02:19.076726 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:19.076708 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5" Apr 17 08:02:19.078926 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:19.078908 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-6w6b9\"" Apr 17 08:02:19.082049 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:19.082026 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5"] Apr 17 08:02:19.090411 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:19.090391 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5" Apr 17 08:02:19.242728 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:19.242702 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5"] Apr 17 08:02:19.245378 ip-10-0-137-165 kubenswrapper[2576]: W0417 08:02:19.245323 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6e54b0b_98b9_4519_a041_63eaebb24609.slice/crio-b19faea0085b20a67e46ccbdcc49743b5e2aa15139a8e9da13355768109595d4 WatchSource:0}: Error finding container b19faea0085b20a67e46ccbdcc49743b5e2aa15139a8e9da13355768109595d4: Status 404 returned error can't find the container with id b19faea0085b20a67e46ccbdcc49743b5e2aa15139a8e9da13355768109595d4 Apr 17 08:02:19.777672 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:19.777633 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5" event={"ID":"d6e54b0b-98b9-4519-a041-63eaebb24609","Type":"ContainerStarted","Data":"b19faea0085b20a67e46ccbdcc49743b5e2aa15139a8e9da13355768109595d4"} Apr 17 08:02:32.818788 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:32.818699 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5" event={"ID":"d6e54b0b-98b9-4519-a041-63eaebb24609","Type":"ContainerStarted","Data":"7f86e0674d66f408abfd359eecabcba0ddb1c5d9fa931cb2f9c54808b494dc02"} Apr 17 08:02:32.819226 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:32.818910 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5" Apr 17 08:02:32.820293 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:32.820249 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5" podUID="d6e54b0b-98b9-4519-a041-63eaebb24609" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 17 08:02:32.866066 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:32.866018 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5" podStartSLOduration=0.657536533 podStartE2EDuration="13.866001109s" podCreationTimestamp="2026-04-17 08:02:19 +0000 UTC" firstStartedPulling="2026-04-17 08:02:19.247429096 +0000 UTC m=+606.904292287" lastFinishedPulling="2026-04-17 08:02:32.455893665 +0000 UTC m=+620.112756863" observedRunningTime="2026-04-17 08:02:32.863988178 +0000 UTC m=+620.520851391" watchObservedRunningTime="2026-04-17 08:02:32.866001109 +0000 UTC m=+620.522864322" Apr 17 08:02:33.821607 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:33.821567 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5" podUID="d6e54b0b-98b9-4519-a041-63eaebb24609" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 17 08:02:43.822532 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:43.822486 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5" podUID="d6e54b0b-98b9-4519-a041-63eaebb24609" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 17 08:02:53.821958 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:02:53.821904 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5" podUID="d6e54b0b-98b9-4519-a041-63eaebb24609" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.25:8080: connect: connection refused" Apr 17 08:03:03.822130 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:03.822089 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5" podUID="d6e54b0b-98b9-4519-a041-63eaebb24609" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 17 08:03:13.822208 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:13.822168 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5" podUID="d6e54b0b-98b9-4519-a041-63eaebb24609" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 17 08:03:23.822849 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:23.822818 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5" Apr 17 08:03:53.310057 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:53.310021 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5"] Apr 17 08:03:53.310577 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:53.310415 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5" podUID="d6e54b0b-98b9-4519-a041-63eaebb24609" containerName="kserve-container" containerID="cri-o://7f86e0674d66f408abfd359eecabcba0ddb1c5d9fa931cb2f9c54808b494dc02" gracePeriod=30 Apr 17 08:03:53.354676 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:53.354622 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v"] Apr 17 08:03:53.358666 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:53.358632 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v" Apr 17 08:03:53.368791 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:53.368590 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v"] Apr 17 08:03:53.374655 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:53.374633 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v" Apr 17 08:03:53.510882 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:53.510858 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v"] Apr 17 08:03:53.513423 ip-10-0-137-165 kubenswrapper[2576]: W0417 08:03:53.513392 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74f58849_9030_4931_9db3_1b2cd2a0088b.slice/crio-177e3535c2322c8420aad6d0be339311258eb90aceb676a8f9031392d87e1579 WatchSource:0}: Error finding container 177e3535c2322c8420aad6d0be339311258eb90aceb676a8f9031392d87e1579: Status 404 returned error can't find the container with id 177e3535c2322c8420aad6d0be339311258eb90aceb676a8f9031392d87e1579 Apr 17 08:03:53.821896 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:53.821798 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5" podUID="d6e54b0b-98b9-4519-a041-63eaebb24609" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 17 08:03:54.052752 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:54.052710 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v" 
event={"ID":"74f58849-9030-4931-9db3-1b2cd2a0088b","Type":"ContainerStarted","Data":"08e604e800b0684f4adba45f2d80acff5cf6dbc12854f1f1c52c5458be76e7e2"} Apr 17 08:03:54.052752 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:54.052749 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v" event={"ID":"74f58849-9030-4931-9db3-1b2cd2a0088b","Type":"ContainerStarted","Data":"177e3535c2322c8420aad6d0be339311258eb90aceb676a8f9031392d87e1579"} Apr 17 08:03:54.052977 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:54.052863 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v" Apr 17 08:03:54.053954 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:54.053930 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v" podUID="74f58849-9030-4931-9db3-1b2cd2a0088b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 17 08:03:54.066608 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:54.066561 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v" podStartSLOduration=1.06654999 podStartE2EDuration="1.06654999s" podCreationTimestamp="2026-04-17 08:03:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:03:54.065487677 +0000 UTC m=+701.722350890" watchObservedRunningTime="2026-04-17 08:03:54.06654999 +0000 UTC m=+701.723413181" Apr 17 08:03:55.055804 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:55.055767 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v" 
podUID="74f58849-9030-4931-9db3-1b2cd2a0088b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 17 08:03:56.060181 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:56.060156 2576 generic.go:358] "Generic (PLEG): container finished" podID="d6e54b0b-98b9-4519-a041-63eaebb24609" containerID="7f86e0674d66f408abfd359eecabcba0ddb1c5d9fa931cb2f9c54808b494dc02" exitCode=0 Apr 17 08:03:56.060537 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:56.060234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5" event={"ID":"d6e54b0b-98b9-4519-a041-63eaebb24609","Type":"ContainerDied","Data":"7f86e0674d66f408abfd359eecabcba0ddb1c5d9fa931cb2f9c54808b494dc02"} Apr 17 08:03:56.151269 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:56.151248 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5" Apr 17 08:03:57.064360 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:57.064324 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5" event={"ID":"d6e54b0b-98b9-4519-a041-63eaebb24609","Type":"ContainerDied","Data":"b19faea0085b20a67e46ccbdcc49743b5e2aa15139a8e9da13355768109595d4"} Apr 17 08:03:57.064763 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:57.064369 2576 scope.go:117] "RemoveContainer" containerID="7f86e0674d66f408abfd359eecabcba0ddb1c5d9fa931cb2f9c54808b494dc02" Apr 17 08:03:57.064763 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:57.064344 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5" Apr 17 08:03:57.080966 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:57.080941 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5"] Apr 17 08:03:57.083089 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:57.083069 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-08159-predictor-d94d646b6-qnkn5"] Apr 17 08:03:58.862493 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:03:58.862459 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6e54b0b-98b9-4519-a041-63eaebb24609" path="/var/lib/kubelet/pods/d6e54b0b-98b9-4519-a041-63eaebb24609/volumes" Apr 17 08:04:05.056412 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:04:05.056372 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v" podUID="74f58849-9030-4931-9db3-1b2cd2a0088b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 17 08:04:15.056299 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:04:15.056232 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v" podUID="74f58849-9030-4931-9db3-1b2cd2a0088b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 17 08:04:25.056133 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:04:25.056089 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v" podUID="74f58849-9030-4931-9db3-1b2cd2a0088b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 17 08:04:29.175241 ip-10-0-137-165 kubenswrapper[2576]: I0417 
08:04:29.175155 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf"] Apr 17 08:04:29.175711 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:04:29.175638 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6e54b0b-98b9-4519-a041-63eaebb24609" containerName="kserve-container" Apr 17 08:04:29.175711 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:04:29.175654 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e54b0b-98b9-4519-a041-63eaebb24609" containerName="kserve-container" Apr 17 08:04:29.175826 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:04:29.175739 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6e54b0b-98b9-4519-a041-63eaebb24609" containerName="kserve-container" Apr 17 08:04:29.180221 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:04:29.180200 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf" Apr 17 08:04:29.188675 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:04:29.188651 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf"] Apr 17 08:04:29.190159 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:04:29.190142 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf" Apr 17 08:04:29.332782 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:04:29.332759 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf"] Apr 17 08:04:29.335034 ip-10-0-137-165 kubenswrapper[2576]: W0417 08:04:29.335005 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaf732ca_639c_4439_94bd_0c696e6a014e.slice/crio-decf6a1a2672830efff19b4279d610d5da16587ff0bde4db07c9e17e84785da5 WatchSource:0}: Error finding container decf6a1a2672830efff19b4279d610d5da16587ff0bde4db07c9e17e84785da5: Status 404 returned error can't find the container with id decf6a1a2672830efff19b4279d610d5da16587ff0bde4db07c9e17e84785da5 Apr 17 08:04:30.167863 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:04:30.167818 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf" event={"ID":"eaf732ca-639c-4439-94bd-0c696e6a014e","Type":"ContainerStarted","Data":"12c719392accf8fef1928f7485509ab7bb52e92efba41eb13b0aaa56c167b5ae"} Apr 17 08:04:30.167863 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:04:30.167867 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf" event={"ID":"eaf732ca-639c-4439-94bd-0c696e6a014e","Type":"ContainerStarted","Data":"decf6a1a2672830efff19b4279d610d5da16587ff0bde4db07c9e17e84785da5"} Apr 17 08:04:30.168137 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:04:30.168055 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf" Apr 17 08:04:30.169186 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:04:30.169162 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf" podUID="eaf732ca-639c-4439-94bd-0c696e6a014e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 17 08:04:30.182067 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:04:30.182024 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf" podStartSLOduration=1.182014445 podStartE2EDuration="1.182014445s" podCreationTimestamp="2026-04-17 08:04:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:04:30.181397174 +0000 UTC m=+737.838260388" watchObservedRunningTime="2026-04-17 08:04:30.182014445 +0000 UTC m=+737.838877657" Apr 17 08:04:31.171372 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:04:31.171331 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf" podUID="eaf732ca-639c-4439-94bd-0c696e6a014e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 17 08:04:35.056211 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:04:35.056168 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v" podUID="74f58849-9030-4931-9db3-1b2cd2a0088b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 17 08:04:41.171818 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:04:41.171775 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf" podUID="eaf732ca-639c-4439-94bd-0c696e6a014e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 
17 08:04:45.057405 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:04:45.057376 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v" Apr 17 08:04:51.171488 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:04:51.171445 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf" podUID="eaf732ca-639c-4439-94bd-0c696e6a014e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 17 08:05:01.171844 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:05:01.171801 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf" podUID="eaf732ca-639c-4439-94bd-0c696e6a014e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 17 08:05:11.171813 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:05:11.171765 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf" podUID="eaf732ca-639c-4439-94bd-0c696e6a014e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 17 08:05:21.173407 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:05:21.173376 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf" Apr 17 08:07:12.826139 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:07:12.826072 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log" Apr 17 08:07:12.827773 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:07:12.827746 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log" Apr 17 08:07:12.830944 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:07:12.830913 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log" Apr 17 08:07:12.832433 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:07:12.832418 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log" Apr 17 08:12:12.847967 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:12:12.847918 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log" Apr 17 08:12:12.849996 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:12:12.849976 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log" Apr 17 08:12:12.852078 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:12:12.852058 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log" Apr 17 08:12:12.853866 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:12:12.853843 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log" Apr 17 08:13:18.242974 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:18.242943 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v"] Apr 17 08:13:18.243551 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:18.243192 2576 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v" podUID="74f58849-9030-4931-9db3-1b2cd2a0088b" containerName="kserve-container" containerID="cri-o://08e604e800b0684f4adba45f2d80acff5cf6dbc12854f1f1c52c5458be76e7e2" gracePeriod=30 Apr 17 08:13:18.308978 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:18.308950 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr"] Apr 17 08:13:18.312321 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:18.312305 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr" Apr 17 08:13:18.322178 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:18.322161 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr" Apr 17 08:13:18.328645 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:18.328621 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr"] Apr 17 08:13:18.451906 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:18.451882 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr"] Apr 17 08:13:18.455119 ip-10-0-137-165 kubenswrapper[2576]: W0417 08:13:18.455087 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80882535_b938_4c10_8df6_f9e0c65e9c9d.slice/crio-8d04a128e392b1b69bdb6efba707f3864acb2e0ccb692409951509bf0088437a WatchSource:0}: Error finding container 8d04a128e392b1b69bdb6efba707f3864acb2e0ccb692409951509bf0088437a: Status 404 returned error can't find the container with id 8d04a128e392b1b69bdb6efba707f3864acb2e0ccb692409951509bf0088437a Apr 17 
08:13:18.457398 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:18.457382 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:13:18.744785 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:18.744717 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr" event={"ID":"80882535-b938-4c10-8df6-f9e0c65e9c9d","Type":"ContainerStarted","Data":"b90c4628123327723fc99f93139769e504024568e614cb230501bb81079fa397"} Apr 17 08:13:18.744785 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:18.744750 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr" event={"ID":"80882535-b938-4c10-8df6-f9e0c65e9c9d","Type":"ContainerStarted","Data":"8d04a128e392b1b69bdb6efba707f3864acb2e0ccb692409951509bf0088437a"} Apr 17 08:13:18.744951 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:18.744873 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr" Apr 17 08:13:18.745888 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:18.745865 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr" podUID="80882535-b938-4c10-8df6-f9e0c65e9c9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 17 08:13:18.759989 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:18.759931 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr" podStartSLOduration=0.759918137 podStartE2EDuration="759.918137ms" podCreationTimestamp="2026-04-17 08:13:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-17 08:13:18.757749407 +0000 UTC m=+1266.414612620" watchObservedRunningTime="2026-04-17 08:13:18.759918137 +0000 UTC m=+1266.416781349" Apr 17 08:13:19.748338 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:19.748296 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr" podUID="80882535-b938-4c10-8df6-f9e0c65e9c9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 17 08:13:20.885877 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:20.885855 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v" Apr 17 08:13:21.759792 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:21.759753 2576 generic.go:358] "Generic (PLEG): container finished" podID="74f58849-9030-4931-9db3-1b2cd2a0088b" containerID="08e604e800b0684f4adba45f2d80acff5cf6dbc12854f1f1c52c5458be76e7e2" exitCode=0 Apr 17 08:13:21.760004 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:21.759842 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v" Apr 17 08:13:21.760004 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:21.759842 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v" event={"ID":"74f58849-9030-4931-9db3-1b2cd2a0088b","Type":"ContainerDied","Data":"08e604e800b0684f4adba45f2d80acff5cf6dbc12854f1f1c52c5458be76e7e2"} Apr 17 08:13:21.760004 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:21.759884 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v" event={"ID":"74f58849-9030-4931-9db3-1b2cd2a0088b","Type":"ContainerDied","Data":"177e3535c2322c8420aad6d0be339311258eb90aceb676a8f9031392d87e1579"} Apr 17 08:13:21.760004 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:21.759899 2576 scope.go:117] "RemoveContainer" containerID="08e604e800b0684f4adba45f2d80acff5cf6dbc12854f1f1c52c5458be76e7e2" Apr 17 08:13:21.768752 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:21.768737 2576 scope.go:117] "RemoveContainer" containerID="08e604e800b0684f4adba45f2d80acff5cf6dbc12854f1f1c52c5458be76e7e2" Apr 17 08:13:21.769008 ip-10-0-137-165 kubenswrapper[2576]: E0417 08:13:21.768989 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08e604e800b0684f4adba45f2d80acff5cf6dbc12854f1f1c52c5458be76e7e2\": container with ID starting with 08e604e800b0684f4adba45f2d80acff5cf6dbc12854f1f1c52c5458be76e7e2 not found: ID does not exist" containerID="08e604e800b0684f4adba45f2d80acff5cf6dbc12854f1f1c52c5458be76e7e2" Apr 17 08:13:21.769076 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:21.769019 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08e604e800b0684f4adba45f2d80acff5cf6dbc12854f1f1c52c5458be76e7e2"} err="failed to get container status 
\"08e604e800b0684f4adba45f2d80acff5cf6dbc12854f1f1c52c5458be76e7e2\": rpc error: code = NotFound desc = could not find container \"08e604e800b0684f4adba45f2d80acff5cf6dbc12854f1f1c52c5458be76e7e2\": container with ID starting with 08e604e800b0684f4adba45f2d80acff5cf6dbc12854f1f1c52c5458be76e7e2 not found: ID does not exist" Apr 17 08:13:21.778997 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:21.778974 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v"] Apr 17 08:13:21.783038 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:21.783019 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4828d-predictor-6ddfbf8d5b-fmz9v"] Apr 17 08:13:22.862479 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:22.862447 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74f58849-9030-4931-9db3-1b2cd2a0088b" path="/var/lib/kubelet/pods/74f58849-9030-4931-9db3-1b2cd2a0088b/volumes" Apr 17 08:13:29.749030 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:29.748990 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr" podUID="80882535-b938-4c10-8df6-f9e0c65e9c9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 17 08:13:39.748861 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:39.748822 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr" podUID="80882535-b938-4c10-8df6-f9e0c65e9c9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 17 08:13:49.749005 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:49.748963 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr" 
podUID="80882535-b938-4c10-8df6-f9e0c65e9c9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 17 08:13:54.063422 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:54.063386 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf"] Apr 17 08:13:54.063890 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:54.063672 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf" podUID="eaf732ca-639c-4439-94bd-0c696e6a014e" containerName="kserve-container" containerID="cri-o://12c719392accf8fef1928f7485509ab7bb52e92efba41eb13b0aaa56c167b5ae" gracePeriod=30 Apr 17 08:13:54.095607 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:54.095580 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl"] Apr 17 08:13:54.095929 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:54.095914 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="74f58849-9030-4931-9db3-1b2cd2a0088b" containerName="kserve-container" Apr 17 08:13:54.095983 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:54.095930 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f58849-9030-4931-9db3-1b2cd2a0088b" containerName="kserve-container" Apr 17 08:13:54.096020 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:54.095990 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="74f58849-9030-4931-9db3-1b2cd2a0088b" containerName="kserve-container" Apr 17 08:13:54.099099 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:54.099084 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl" Apr 17 08:13:54.109299 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:54.108664 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl"] Apr 17 08:13:54.113387 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:54.113365 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl" Apr 17 08:13:54.243606 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:54.243518 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl"] Apr 17 08:13:54.246310 ip-10-0-137-165 kubenswrapper[2576]: W0417 08:13:54.246265 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded31a636_b28e_41cc_80a5_c62b82f1c579.slice/crio-7e50ff539d6030490a6ffa77061000d87c70c613e73adf796cb3ca9b6f6376a3 WatchSource:0}: Error finding container 7e50ff539d6030490a6ffa77061000d87c70c613e73adf796cb3ca9b6f6376a3: Status 404 returned error can't find the container with id 7e50ff539d6030490a6ffa77061000d87c70c613e73adf796cb3ca9b6f6376a3 Apr 17 08:13:54.865583 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:54.865549 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl" event={"ID":"ed31a636-b28e-41cc-80a5-c62b82f1c579","Type":"ContainerStarted","Data":"aaa67c42524b25b05476dfdbee553b08411e63e9267193ab8c55875ddad109cc"} Apr 17 08:13:54.865777 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:54.865590 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl" 
event={"ID":"ed31a636-b28e-41cc-80a5-c62b82f1c579","Type":"ContainerStarted","Data":"7e50ff539d6030490a6ffa77061000d87c70c613e73adf796cb3ca9b6f6376a3"} Apr 17 08:13:54.865851 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:54.865839 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl" Apr 17 08:13:54.867034 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:54.867000 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl" podUID="ed31a636-b28e-41cc-80a5-c62b82f1c579" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 17 08:13:54.879372 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:54.879329 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl" podStartSLOduration=0.879316769 podStartE2EDuration="879.316769ms" podCreationTimestamp="2026-04-17 08:13:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:13:54.878190641 +0000 UTC m=+1302.535053853" watchObservedRunningTime="2026-04-17 08:13:54.879316769 +0000 UTC m=+1302.536179985" Apr 17 08:13:55.869228 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:55.869194 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl" podUID="ed31a636-b28e-41cc-80a5-c62b82f1c579" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 17 08:13:56.872618 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:56.872582 2576 generic.go:358] "Generic (PLEG): container finished" podID="eaf732ca-639c-4439-94bd-0c696e6a014e" 
containerID="12c719392accf8fef1928f7485509ab7bb52e92efba41eb13b0aaa56c167b5ae" exitCode=0 Apr 17 08:13:56.872966 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:56.872654 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf" event={"ID":"eaf732ca-639c-4439-94bd-0c696e6a014e","Type":"ContainerDied","Data":"12c719392accf8fef1928f7485509ab7bb52e92efba41eb13b0aaa56c167b5ae"} Apr 17 08:13:57.315006 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:57.314987 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf" Apr 17 08:13:57.876678 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:57.876640 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf" event={"ID":"eaf732ca-639c-4439-94bd-0c696e6a014e","Type":"ContainerDied","Data":"decf6a1a2672830efff19b4279d610d5da16587ff0bde4db07c9e17e84785da5"} Apr 17 08:13:57.876678 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:57.876664 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf" Apr 17 08:13:57.876678 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:57.876680 2576 scope.go:117] "RemoveContainer" containerID="12c719392accf8fef1928f7485509ab7bb52e92efba41eb13b0aaa56c167b5ae" Apr 17 08:13:57.896799 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:57.896746 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf"] Apr 17 08:13:57.899103 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:57.899082 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c3bdf-predictor-6fc8445484-6hzdf"] Apr 17 08:13:58.863500 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:58.863464 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaf732ca-639c-4439-94bd-0c696e6a014e" path="/var/lib/kubelet/pods/eaf732ca-639c-4439-94bd-0c696e6a014e/volumes" Apr 17 08:13:59.748558 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:13:59.748517 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr" podUID="80882535-b938-4c10-8df6-f9e0c65e9c9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 17 08:14:05.869510 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:05.869472 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl" podUID="ed31a636-b28e-41cc-80a5-c62b82f1c579" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 17 08:14:09.749267 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:09.749237 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr" Apr 17 
08:14:15.869244 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:15.869202 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl" podUID="ed31a636-b28e-41cc-80a5-c62b82f1c579" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 17 08:14:25.869856 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:25.869816 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl" podUID="ed31a636-b28e-41cc-80a5-c62b82f1c579" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 17 08:14:35.869290 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:35.869249 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl" podUID="ed31a636-b28e-41cc-80a5-c62b82f1c579" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 17 08:14:38.546651 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:38.546617 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr"] Apr 17 08:14:38.547006 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:38.546912 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr" podUID="80882535-b938-4c10-8df6-f9e0c65e9c9d" containerName="kserve-container" containerID="cri-o://b90c4628123327723fc99f93139769e504024568e614cb230501bb81079fa397" gracePeriod=30 Apr 17 08:14:38.561250 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:38.561222 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk"] Apr 17 08:14:38.561623 
ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:38.561606 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eaf732ca-639c-4439-94bd-0c696e6a014e" containerName="kserve-container" Apr 17 08:14:38.561695 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:38.561626 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf732ca-639c-4439-94bd-0c696e6a014e" containerName="kserve-container" Apr 17 08:14:38.561751 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:38.561713 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="eaf732ca-639c-4439-94bd-0c696e6a014e" containerName="kserve-container" Apr 17 08:14:38.564653 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:38.564633 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk" Apr 17 08:14:38.571771 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:38.571398 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk"] Apr 17 08:14:38.574817 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:38.574797 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk" Apr 17 08:14:38.713326 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:38.713252 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk"] Apr 17 08:14:38.716687 ip-10-0-137-165 kubenswrapper[2576]: W0417 08:14:38.716655 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9db16dfd_98b0_46d1_a6a1_779cec048066.slice/crio-87547324a2bee13555be846ec2cee6f2d92a927f5a5afca17fe20f796daa1251 WatchSource:0}: Error finding container 87547324a2bee13555be846ec2cee6f2d92a927f5a5afca17fe20f796daa1251: Status 404 returned error can't find the container with id 87547324a2bee13555be846ec2cee6f2d92a927f5a5afca17fe20f796daa1251 Apr 17 08:14:39.009692 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:39.009659 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk" event={"ID":"9db16dfd-98b0-46d1-a6a1-779cec048066","Type":"ContainerStarted","Data":"316e8d2086f7f13b3ff4101978c49df3650595b269916a91e952e6e7f25c5a40"} Apr 17 08:14:39.009692 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:39.009692 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk" event={"ID":"9db16dfd-98b0-46d1-a6a1-779cec048066","Type":"ContainerStarted","Data":"87547324a2bee13555be846ec2cee6f2d92a927f5a5afca17fe20f796daa1251"} Apr 17 08:14:39.009926 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:39.009872 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk" Apr 17 08:14:39.011139 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:39.011111 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk" podUID="9db16dfd-98b0-46d1-a6a1-779cec048066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 08:14:39.025203 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:39.025166 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk" podStartSLOduration=1.025153964 podStartE2EDuration="1.025153964s" podCreationTimestamp="2026-04-17 08:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:14:39.023165061 +0000 UTC m=+1346.680028299" watchObservedRunningTime="2026-04-17 08:14:39.025153964 +0000 UTC m=+1346.682017178" Apr 17 08:14:39.749328 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:39.749265 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr" podUID="80882535-b938-4c10-8df6-f9e0c65e9c9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 17 08:14:40.012850 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:40.012763 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk" podUID="9db16dfd-98b0-46d1-a6a1-779cec048066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 08:14:41.388799 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:41.388779 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr" Apr 17 08:14:42.018826 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:42.018794 2576 generic.go:358] "Generic (PLEG): container finished" podID="80882535-b938-4c10-8df6-f9e0c65e9c9d" containerID="b90c4628123327723fc99f93139769e504024568e614cb230501bb81079fa397" exitCode=0 Apr 17 08:14:42.018997 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:42.018839 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr" event={"ID":"80882535-b938-4c10-8df6-f9e0c65e9c9d","Type":"ContainerDied","Data":"b90c4628123327723fc99f93139769e504024568e614cb230501bb81079fa397"} Apr 17 08:14:42.018997 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:42.018853 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr" Apr 17 08:14:42.018997 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:42.018870 2576 scope.go:117] "RemoveContainer" containerID="b90c4628123327723fc99f93139769e504024568e614cb230501bb81079fa397" Apr 17 08:14:42.018997 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:42.018859 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr" event={"ID":"80882535-b938-4c10-8df6-f9e0c65e9c9d","Type":"ContainerDied","Data":"8d04a128e392b1b69bdb6efba707f3864acb2e0ccb692409951509bf0088437a"} Apr 17 08:14:42.026824 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:42.026808 2576 scope.go:117] "RemoveContainer" containerID="b90c4628123327723fc99f93139769e504024568e614cb230501bb81079fa397" Apr 17 08:14:42.027080 ip-10-0-137-165 kubenswrapper[2576]: E0417 08:14:42.027040 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b90c4628123327723fc99f93139769e504024568e614cb230501bb81079fa397\": container with ID starting with b90c4628123327723fc99f93139769e504024568e614cb230501bb81079fa397 not found: ID does not exist" containerID="b90c4628123327723fc99f93139769e504024568e614cb230501bb81079fa397" Apr 17 08:14:42.027131 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:42.027087 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b90c4628123327723fc99f93139769e504024568e614cb230501bb81079fa397"} err="failed to get container status \"b90c4628123327723fc99f93139769e504024568e614cb230501bb81079fa397\": rpc error: code = NotFound desc = could not find container \"b90c4628123327723fc99f93139769e504024568e614cb230501bb81079fa397\": container with ID starting with b90c4628123327723fc99f93139769e504024568e614cb230501bb81079fa397 not found: ID does not exist" Apr 17 08:14:42.037318 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:42.037288 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr"] Apr 17 08:14:42.041243 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:42.041224 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-390bf-predictor-695484878d-z98cr"] Apr 17 08:14:42.862766 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:42.862728 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80882535-b938-4c10-8df6-f9e0c65e9c9d" path="/var/lib/kubelet/pods/80882535-b938-4c10-8df6-f9e0c65e9c9d/volumes" Apr 17 08:14:45.869946 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:45.869919 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl" Apr 17 08:14:50.012896 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:14:50.012854 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk" podUID="9db16dfd-98b0-46d1-a6a1-779cec048066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 08:15:00.013876 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:00.013834 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk" podUID="9db16dfd-98b0-46d1-a6a1-779cec048066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 08:15:10.013710 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:10.013671 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk" podUID="9db16dfd-98b0-46d1-a6a1-779cec048066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 08:15:14.343579 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:14.343547 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl"] Apr 17 08:15:14.343941 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:14.343788 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl" podUID="ed31a636-b28e-41cc-80a5-c62b82f1c579" containerName="kserve-container" containerID="cri-o://aaa67c42524b25b05476dfdbee553b08411e63e9267193ab8c55875ddad109cc" gracePeriod=30 Apr 17 08:15:14.351901 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:14.351879 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q"] Apr 17 08:15:14.352208 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:14.352195 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="80882535-b938-4c10-8df6-f9e0c65e9c9d" containerName="kserve-container" Apr 17 08:15:14.352253 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:14.352211 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="80882535-b938-4c10-8df6-f9e0c65e9c9d" containerName="kserve-container" Apr 17 08:15:14.352307 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:14.352270 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="80882535-b938-4c10-8df6-f9e0c65e9c9d" containerName="kserve-container" Apr 17 08:15:14.356342 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:14.356325 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q" Apr 17 08:15:14.365346 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:14.365326 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q" Apr 17 08:15:14.373074 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:14.373050 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q"] Apr 17 08:15:14.490776 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:14.490748 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q"] Apr 17 08:15:14.496169 ip-10-0-137-165 kubenswrapper[2576]: W0417 08:15:14.496109 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f094e49_fecd_4e33_8e15_a761ff85682c.slice/crio-1bdca647168cd52869ce05914aa40ddf76446292bf5b91eb56d910106852c24c WatchSource:0}: Error finding container 1bdca647168cd52869ce05914aa40ddf76446292bf5b91eb56d910106852c24c: Status 404 returned error can't find the container with id 1bdca647168cd52869ce05914aa40ddf76446292bf5b91eb56d910106852c24c Apr 17 08:15:15.114731 
ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:15.114696 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q" event={"ID":"7f094e49-fecd-4e33-8e15-a761ff85682c","Type":"ContainerStarted","Data":"0292e424b6c59f12f1544f562272cb3deb1b73ceaae30dc23e83062074c1fa63"} Apr 17 08:15:15.114731 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:15.114732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q" event={"ID":"7f094e49-fecd-4e33-8e15-a761ff85682c","Type":"ContainerStarted","Data":"1bdca647168cd52869ce05914aa40ddf76446292bf5b91eb56d910106852c24c"} Apr 17 08:15:15.114983 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:15.114918 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q" Apr 17 08:15:15.116308 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:15.116269 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q" podUID="7f094e49-fecd-4e33-8e15-a761ff85682c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 08:15:15.129533 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:15.129495 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q" podStartSLOduration=1.129483218 podStartE2EDuration="1.129483218s" podCreationTimestamp="2026-04-17 08:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:15:15.127941489 +0000 UTC m=+1382.784804702" watchObservedRunningTime="2026-04-17 08:15:15.129483218 +0000 UTC m=+1382.786346793" Apr 17 08:15:15.870027 ip-10-0-137-165 kubenswrapper[2576]: 
I0417 08:15:15.869981 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl" podUID="ed31a636-b28e-41cc-80a5-c62b82f1c579" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 17 08:15:16.118381 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:16.118329 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q" podUID="7f094e49-fecd-4e33-8e15-a761ff85682c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 08:15:17.375605 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:17.375582 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl" Apr 17 08:15:18.124322 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:18.124290 2576 generic.go:358] "Generic (PLEG): container finished" podID="ed31a636-b28e-41cc-80a5-c62b82f1c579" containerID="aaa67c42524b25b05476dfdbee553b08411e63e9267193ab8c55875ddad109cc" exitCode=0 Apr 17 08:15:18.124516 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:18.124358 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl" Apr 17 08:15:18.124516 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:18.124361 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl" event={"ID":"ed31a636-b28e-41cc-80a5-c62b82f1c579","Type":"ContainerDied","Data":"aaa67c42524b25b05476dfdbee553b08411e63e9267193ab8c55875ddad109cc"} Apr 17 08:15:18.124516 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:18.124405 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl" event={"ID":"ed31a636-b28e-41cc-80a5-c62b82f1c579","Type":"ContainerDied","Data":"7e50ff539d6030490a6ffa77061000d87c70c613e73adf796cb3ca9b6f6376a3"} Apr 17 08:15:18.124516 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:18.124421 2576 scope.go:117] "RemoveContainer" containerID="aaa67c42524b25b05476dfdbee553b08411e63e9267193ab8c55875ddad109cc" Apr 17 08:15:18.132603 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:18.132384 2576 scope.go:117] "RemoveContainer" containerID="aaa67c42524b25b05476dfdbee553b08411e63e9267193ab8c55875ddad109cc" Apr 17 08:15:18.132744 ip-10-0-137-165 kubenswrapper[2576]: E0417 08:15:18.132630 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaa67c42524b25b05476dfdbee553b08411e63e9267193ab8c55875ddad109cc\": container with ID starting with aaa67c42524b25b05476dfdbee553b08411e63e9267193ab8c55875ddad109cc not found: ID does not exist" containerID="aaa67c42524b25b05476dfdbee553b08411e63e9267193ab8c55875ddad109cc" Apr 17 08:15:18.132744 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:18.132667 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaa67c42524b25b05476dfdbee553b08411e63e9267193ab8c55875ddad109cc"} err="failed to get container status 
\"aaa67c42524b25b05476dfdbee553b08411e63e9267193ab8c55875ddad109cc\": rpc error: code = NotFound desc = could not find container \"aaa67c42524b25b05476dfdbee553b08411e63e9267193ab8c55875ddad109cc\": container with ID starting with aaa67c42524b25b05476dfdbee553b08411e63e9267193ab8c55875ddad109cc not found: ID does not exist" Apr 17 08:15:18.145314 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:18.145293 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl"] Apr 17 08:15:18.148069 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:18.148051 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-375fe-predictor-5754d77b4d-dxsfl"] Apr 17 08:15:18.862685 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:18.862651 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed31a636-b28e-41cc-80a5-c62b82f1c579" path="/var/lib/kubelet/pods/ed31a636-b28e-41cc-80a5-c62b82f1c579/volumes" Apr 17 08:15:20.013487 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:20.013453 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk" podUID="9db16dfd-98b0-46d1-a6a1-779cec048066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 08:15:26.119160 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:26.119123 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q" podUID="7f094e49-fecd-4e33-8e15-a761ff85682c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 08:15:30.014185 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:30.014155 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk" Apr 17 08:15:36.118667 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:36.118629 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q" podUID="7f094e49-fecd-4e33-8e15-a761ff85682c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 08:15:46.119265 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:46.119226 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q" podUID="7f094e49-fecd-4e33-8e15-a761ff85682c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 08:15:56.118570 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:15:56.118532 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q" podUID="7f094e49-fecd-4e33-8e15-a761ff85682c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 08:16:06.119658 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:16:06.119632 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q" Apr 17 08:17:12.869648 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:17:12.869619 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log" Apr 17 08:17:12.871824 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:17:12.871807 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log" Apr 17 08:17:12.874103 
ip-10-0-137-165 kubenswrapper[2576]: I0417 08:17:12.874083 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log" Apr 17 08:17:12.876125 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:17:12.876109 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log" Apr 17 08:22:12.890099 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:22:12.890064 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log" Apr 17 08:22:12.893647 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:22:12.893627 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log" Apr 17 08:22:12.894757 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:22:12.894739 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log" Apr 17 08:22:12.897938 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:22:12.897923 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log" Apr 17 08:24:03.433227 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:03.433137 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk"] Apr 17 08:24:03.433767 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:03.433439 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk" 
podUID="9db16dfd-98b0-46d1-a6a1-779cec048066" containerName="kserve-container" containerID="cri-o://316e8d2086f7f13b3ff4101978c49df3650595b269916a91e952e6e7f25c5a40" gracePeriod=30 Apr 17 08:24:03.500974 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:03.500914 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v"] Apr 17 08:24:03.501795 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:03.501703 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed31a636-b28e-41cc-80a5-c62b82f1c579" containerName="kserve-container" Apr 17 08:24:03.501795 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:03.501727 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed31a636-b28e-41cc-80a5-c62b82f1c579" containerName="kserve-container" Apr 17 08:24:03.501995 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:03.501821 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed31a636-b28e-41cc-80a5-c62b82f1c579" containerName="kserve-container" Apr 17 08:24:03.505299 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:03.505262 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v" Apr 17 08:24:03.517098 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:03.516712 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v" Apr 17 08:24:03.518969 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:03.518942 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v"] Apr 17 08:24:03.639419 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:03.639367 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v"] Apr 17 08:24:03.641696 ip-10-0-137-165 kubenswrapper[2576]: W0417 08:24:03.641662 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd41082eb_419f_4b18_8bea_0f1939d92dca.slice/crio-dd9f31e934a6db85d7122c0ea0832edba991a6e924657d617c6300a25a87c103 WatchSource:0}: Error finding container dd9f31e934a6db85d7122c0ea0832edba991a6e924657d617c6300a25a87c103: Status 404 returned error can't find the container with id dd9f31e934a6db85d7122c0ea0832edba991a6e924657d617c6300a25a87c103 Apr 17 08:24:03.643634 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:03.643612 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:24:03.682125 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:03.682095 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v" event={"ID":"d41082eb-419f-4b18-8bea-0f1939d92dca","Type":"ContainerStarted","Data":"dd9f31e934a6db85d7122c0ea0832edba991a6e924657d617c6300a25a87c103"} Apr 17 08:24:04.686873 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:04.686832 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v" event={"ID":"d41082eb-419f-4b18-8bea-0f1939d92dca","Type":"ContainerStarted","Data":"7c4c05d10c8dd909323a8d8a8bb60cb7c9006ed8aa32872225565d559913b8c1"} Apr 
17 08:24:04.687352 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:04.686983 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v" Apr 17 08:24:04.688157 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:04.688135 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v" podUID="d41082eb-419f-4b18-8bea-0f1939d92dca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 17 08:24:04.701449 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:04.701390 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v" podStartSLOduration=1.7013737899999999 podStartE2EDuration="1.70137379s" podCreationTimestamp="2026-04-17 08:24:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:24:04.701067377 +0000 UTC m=+1912.357930601" watchObservedRunningTime="2026-04-17 08:24:04.70137379 +0000 UTC m=+1912.358237004" Apr 17 08:24:05.689895 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:05.689857 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v" podUID="d41082eb-419f-4b18-8bea-0f1939d92dca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 17 08:24:06.572608 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:06.572586 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk" Apr 17 08:24:06.694469 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:06.694400 2576 generic.go:358] "Generic (PLEG): container finished" podID="9db16dfd-98b0-46d1-a6a1-779cec048066" containerID="316e8d2086f7f13b3ff4101978c49df3650595b269916a91e952e6e7f25c5a40" exitCode=0 Apr 17 08:24:06.694469 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:06.694458 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk" event={"ID":"9db16dfd-98b0-46d1-a6a1-779cec048066","Type":"ContainerDied","Data":"316e8d2086f7f13b3ff4101978c49df3650595b269916a91e952e6e7f25c5a40"} Apr 17 08:24:06.694894 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:06.694473 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk" Apr 17 08:24:06.694894 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:06.694485 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk" event={"ID":"9db16dfd-98b0-46d1-a6a1-779cec048066","Type":"ContainerDied","Data":"87547324a2bee13555be846ec2cee6f2d92a927f5a5afca17fe20f796daa1251"} Apr 17 08:24:06.694894 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:06.694500 2576 scope.go:117] "RemoveContainer" containerID="316e8d2086f7f13b3ff4101978c49df3650595b269916a91e952e6e7f25c5a40" Apr 17 08:24:06.702559 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:06.702544 2576 scope.go:117] "RemoveContainer" containerID="316e8d2086f7f13b3ff4101978c49df3650595b269916a91e952e6e7f25c5a40" Apr 17 08:24:06.702799 ip-10-0-137-165 kubenswrapper[2576]: E0417 08:24:06.702784 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"316e8d2086f7f13b3ff4101978c49df3650595b269916a91e952e6e7f25c5a40\": container with ID starting with 316e8d2086f7f13b3ff4101978c49df3650595b269916a91e952e6e7f25c5a40 not found: ID does not exist" containerID="316e8d2086f7f13b3ff4101978c49df3650595b269916a91e952e6e7f25c5a40" Apr 17 08:24:06.702845 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:06.702808 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316e8d2086f7f13b3ff4101978c49df3650595b269916a91e952e6e7f25c5a40"} err="failed to get container status \"316e8d2086f7f13b3ff4101978c49df3650595b269916a91e952e6e7f25c5a40\": rpc error: code = NotFound desc = could not find container \"316e8d2086f7f13b3ff4101978c49df3650595b269916a91e952e6e7f25c5a40\": container with ID starting with 316e8d2086f7f13b3ff4101978c49df3650595b269916a91e952e6e7f25c5a40 not found: ID does not exist" Apr 17 08:24:06.714391 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:06.714354 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk"] Apr 17 08:24:06.716072 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:06.716053 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-65d2e-predictor-56f76468b8-qqxkk"] Apr 17 08:24:06.863065 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:06.863040 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9db16dfd-98b0-46d1-a6a1-779cec048066" path="/var/lib/kubelet/pods/9db16dfd-98b0-46d1-a6a1-779cec048066/volumes" Apr 17 08:24:15.690102 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:15.690055 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v" podUID="d41082eb-419f-4b18-8bea-0f1939d92dca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 17 08:24:25.690762 
ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:25.690715 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v" podUID="d41082eb-419f-4b18-8bea-0f1939d92dca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 17 08:24:35.690665 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:35.690618 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v" podUID="d41082eb-419f-4b18-8bea-0f1939d92dca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 17 08:24:39.283152 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:39.283118 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q"]
Apr 17 08:24:39.283609 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:39.283447 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q" podUID="7f094e49-fecd-4e33-8e15-a761ff85682c" containerName="kserve-container" containerID="cri-o://0292e424b6c59f12f1544f562272cb3deb1b73ceaae30dc23e83062074c1fa63" gracePeriod=30
Apr 17 08:24:39.312245 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:39.312216 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf"]
Apr 17 08:24:39.312573 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:39.312559 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9db16dfd-98b0-46d1-a6a1-779cec048066" containerName="kserve-container"
Apr 17 08:24:39.312635 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:39.312575 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db16dfd-98b0-46d1-a6a1-779cec048066" containerName="kserve-container"
Apr 17 08:24:39.312674 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:39.312638 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9db16dfd-98b0-46d1-a6a1-779cec048066" containerName="kserve-container"
Apr 17 08:24:39.316200 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:39.316183 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf"
Apr 17 08:24:39.322215 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:39.322193 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf"]
Apr 17 08:24:39.326660 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:39.326641 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf"
Apr 17 08:24:39.466492 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:39.466455 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf"]
Apr 17 08:24:39.472646 ip-10-0-137-165 kubenswrapper[2576]: W0417 08:24:39.472603 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf2bd1ce_a247_4c04_bdfc_4e20dbdcb812.slice/crio-87ba8e246b55655e75de718b6010c67b182b39bd38181aac87a922e22eac2abb WatchSource:0}: Error finding container 87ba8e246b55655e75de718b6010c67b182b39bd38181aac87a922e22eac2abb: Status 404 returned error can't find the container with id 87ba8e246b55655e75de718b6010c67b182b39bd38181aac87a922e22eac2abb
Apr 17 08:24:39.791737 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:39.791707 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf" event={"ID":"bf2bd1ce-a247-4c04-bdfc-4e20dbdcb812","Type":"ContainerStarted","Data":"d29f2ed1f0d65f66374e70a1233f961c67b90e8ff527c737c9b9235a26c3dab7"}
Apr 17 08:24:39.791899 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:39.791743 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf" event={"ID":"bf2bd1ce-a247-4c04-bdfc-4e20dbdcb812","Type":"ContainerStarted","Data":"87ba8e246b55655e75de718b6010c67b182b39bd38181aac87a922e22eac2abb"}
Apr 17 08:24:39.792037 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:39.792004 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf"
Apr 17 08:24:39.793411 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:39.793381 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf" podUID="bf2bd1ce-a247-4c04-bdfc-4e20dbdcb812" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 17 08:24:39.808333 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:39.808263 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf" podStartSLOduration=0.808246772 podStartE2EDuration="808.246772ms" podCreationTimestamp="2026-04-17 08:24:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:24:39.806722361 +0000 UTC m=+1947.463585584" watchObservedRunningTime="2026-04-17 08:24:39.808246772 +0000 UTC m=+1947.465109984"
Apr 17 08:24:40.794980 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:40.794944 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf" podUID="bf2bd1ce-a247-4c04-bdfc-4e20dbdcb812" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 17 08:24:42.325326 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:42.325299 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q"
Apr 17 08:24:42.803974 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:42.803938 2576 generic.go:358] "Generic (PLEG): container finished" podID="7f094e49-fecd-4e33-8e15-a761ff85682c" containerID="0292e424b6c59f12f1544f562272cb3deb1b73ceaae30dc23e83062074c1fa63" exitCode=0
Apr 17 08:24:42.804158 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:42.804000 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q"
Apr 17 08:24:42.804158 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:42.804031 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q" event={"ID":"7f094e49-fecd-4e33-8e15-a761ff85682c","Type":"ContainerDied","Data":"0292e424b6c59f12f1544f562272cb3deb1b73ceaae30dc23e83062074c1fa63"}
Apr 17 08:24:42.804158 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:42.804067 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q" event={"ID":"7f094e49-fecd-4e33-8e15-a761ff85682c","Type":"ContainerDied","Data":"1bdca647168cd52869ce05914aa40ddf76446292bf5b91eb56d910106852c24c"}
Apr 17 08:24:42.804158 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:42.804084 2576 scope.go:117] "RemoveContainer" containerID="0292e424b6c59f12f1544f562272cb3deb1b73ceaae30dc23e83062074c1fa63"
Apr 17 08:24:42.812714 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:42.812695 2576 scope.go:117] "RemoveContainer" containerID="0292e424b6c59f12f1544f562272cb3deb1b73ceaae30dc23e83062074c1fa63"
Apr 17 08:24:42.813019 ip-10-0-137-165 kubenswrapper[2576]: E0417 08:24:42.812996 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0292e424b6c59f12f1544f562272cb3deb1b73ceaae30dc23e83062074c1fa63\": container with ID starting with 0292e424b6c59f12f1544f562272cb3deb1b73ceaae30dc23e83062074c1fa63 not found: ID does not exist" containerID="0292e424b6c59f12f1544f562272cb3deb1b73ceaae30dc23e83062074c1fa63"
Apr 17 08:24:42.813066 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:42.813029 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0292e424b6c59f12f1544f562272cb3deb1b73ceaae30dc23e83062074c1fa63"} err="failed to get container status \"0292e424b6c59f12f1544f562272cb3deb1b73ceaae30dc23e83062074c1fa63\": rpc error: code = NotFound desc = could not find container \"0292e424b6c59f12f1544f562272cb3deb1b73ceaae30dc23e83062074c1fa63\": container with ID starting with 0292e424b6c59f12f1544f562272cb3deb1b73ceaae30dc23e83062074c1fa63 not found: ID does not exist"
Apr 17 08:24:42.824957 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:42.824928 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q"]
Apr 17 08:24:42.829038 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:42.829015 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-60721-predictor-56cc48fd48-qqh8q"]
Apr 17 08:24:42.861848 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:42.861815 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f094e49-fecd-4e33-8e15-a761ff85682c" path="/var/lib/kubelet/pods/7f094e49-fecd-4e33-8e15-a761ff85682c/volumes"
Apr 17 08:24:45.690141 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:45.690101 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v" podUID="d41082eb-419f-4b18-8bea-0f1939d92dca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 17 08:24:50.795483 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:50.795441 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf" podUID="bf2bd1ce-a247-4c04-bdfc-4e20dbdcb812" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 17 08:24:55.690504 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:24:55.690461 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v"
Apr 17 08:25:00.795485 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:00.795391 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf" podUID="bf2bd1ce-a247-4c04-bdfc-4e20dbdcb812" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 17 08:25:10.795178 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:10.795136 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf" podUID="bf2bd1ce-a247-4c04-bdfc-4e20dbdcb812" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 17 08:25:20.795586 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:20.795538 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf" podUID="bf2bd1ce-a247-4c04-bdfc-4e20dbdcb812" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 17 08:25:23.764650 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:23.764617 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v"]
Apr 17 08:25:23.765141 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:23.764874 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v" podUID="d41082eb-419f-4b18-8bea-0f1939d92dca" containerName="kserve-container" containerID="cri-o://7c4c05d10c8dd909323a8d8a8bb60cb7c9006ed8aa32872225565d559913b8c1" gracePeriod=30
Apr 17 08:25:23.767251 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:23.767226 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj"]
Apr 17 08:25:23.767638 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:23.767623 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f094e49-fecd-4e33-8e15-a761ff85682c" containerName="kserve-container"
Apr 17 08:25:23.767732 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:23.767642 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f094e49-fecd-4e33-8e15-a761ff85682c" containerName="kserve-container"
Apr 17 08:25:23.767787 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:23.767742 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f094e49-fecd-4e33-8e15-a761ff85682c" containerName="kserve-container"
Apr 17 08:25:23.771994 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:23.771975 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj"
Apr 17 08:25:23.779455 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:23.779429 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj"]
Apr 17 08:25:23.783021 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:23.783003 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj"
Apr 17 08:25:23.928412 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:23.928386 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj"]
Apr 17 08:25:23.930849 ip-10-0-137-165 kubenswrapper[2576]: W0417 08:25:23.930821 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b7ce7b8_fccb_4f5a_a259_f908d3d26902.slice/crio-8677acd8e248ce9ad6c0336a5c11fd2eecdc787e8927586696cfaa37e20d6299 WatchSource:0}: Error finding container 8677acd8e248ce9ad6c0336a5c11fd2eecdc787e8927586696cfaa37e20d6299: Status 404 returned error can't find the container with id 8677acd8e248ce9ad6c0336a5c11fd2eecdc787e8927586696cfaa37e20d6299
Apr 17 08:25:24.932604 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:24.932559 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj" event={"ID":"7b7ce7b8-fccb-4f5a-a259-f908d3d26902","Type":"ContainerStarted","Data":"507b7b69341d9c110c592b8114736edca833db9fc773eda031854d7196a3fdec"}
Apr 17 08:25:24.932604 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:24.932607 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj" event={"ID":"7b7ce7b8-fccb-4f5a-a259-f908d3d26902","Type":"ContainerStarted","Data":"8677acd8e248ce9ad6c0336a5c11fd2eecdc787e8927586696cfaa37e20d6299"}
Apr 17 08:25:24.933082 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:24.932738 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj"
Apr 17 08:25:24.934236 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:24.934206 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj" podUID="7b7ce7b8-fccb-4f5a-a259-f908d3d26902" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 17 08:25:24.946858 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:24.946808 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj" podStartSLOduration=1.946792002 podStartE2EDuration="1.946792002s" podCreationTimestamp="2026-04-17 08:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:25:24.945181712 +0000 UTC m=+1992.602044937" watchObservedRunningTime="2026-04-17 08:25:24.946792002 +0000 UTC m=+1992.603655216"
Apr 17 08:25:25.690342 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:25.690301 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v" podUID="d41082eb-419f-4b18-8bea-0f1939d92dca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 17 08:25:25.935691 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:25.935648 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj" podUID="7b7ce7b8-fccb-4f5a-a259-f908d3d26902" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 17 08:25:27.212102 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:27.212077 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v"
Apr 17 08:25:27.941666 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:27.941632 2576 generic.go:358] "Generic (PLEG): container finished" podID="d41082eb-419f-4b18-8bea-0f1939d92dca" containerID="7c4c05d10c8dd909323a8d8a8bb60cb7c9006ed8aa32872225565d559913b8c1" exitCode=0
Apr 17 08:25:27.941844 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:27.941730 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v"
Apr 17 08:25:27.941844 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:27.941729 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v" event={"ID":"d41082eb-419f-4b18-8bea-0f1939d92dca","Type":"ContainerDied","Data":"7c4c05d10c8dd909323a8d8a8bb60cb7c9006ed8aa32872225565d559913b8c1"}
Apr 17 08:25:27.941929 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:27.941844 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v" event={"ID":"d41082eb-419f-4b18-8bea-0f1939d92dca","Type":"ContainerDied","Data":"dd9f31e934a6db85d7122c0ea0832edba991a6e924657d617c6300a25a87c103"}
Apr 17 08:25:27.941929 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:27.941862 2576 scope.go:117] "RemoveContainer" containerID="7c4c05d10c8dd909323a8d8a8bb60cb7c9006ed8aa32872225565d559913b8c1"
Apr 17 08:25:27.950464 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:27.950445 2576 scope.go:117] "RemoveContainer" containerID="7c4c05d10c8dd909323a8d8a8bb60cb7c9006ed8aa32872225565d559913b8c1"
Apr 17 08:25:27.950685 ip-10-0-137-165 kubenswrapper[2576]: E0417 08:25:27.950666 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c4c05d10c8dd909323a8d8a8bb60cb7c9006ed8aa32872225565d559913b8c1\": container with ID starting with 7c4c05d10c8dd909323a8d8a8bb60cb7c9006ed8aa32872225565d559913b8c1 not found: ID does not exist" containerID="7c4c05d10c8dd909323a8d8a8bb60cb7c9006ed8aa32872225565d559913b8c1"
Apr 17 08:25:27.950763 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:27.950692 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c4c05d10c8dd909323a8d8a8bb60cb7c9006ed8aa32872225565d559913b8c1"} err="failed to get container status \"7c4c05d10c8dd909323a8d8a8bb60cb7c9006ed8aa32872225565d559913b8c1\": rpc error: code = NotFound desc = could not find container \"7c4c05d10c8dd909323a8d8a8bb60cb7c9006ed8aa32872225565d559913b8c1\": container with ID starting with 7c4c05d10c8dd909323a8d8a8bb60cb7c9006ed8aa32872225565d559913b8c1 not found: ID does not exist"
Apr 17 08:25:27.962449 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:27.962423 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v"]
Apr 17 08:25:27.967537 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:27.967513 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-00a8c-predictor-5bc8467b76-n7f4v"]
Apr 17 08:25:28.862181 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:28.862141 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d41082eb-419f-4b18-8bea-0f1939d92dca" path="/var/lib/kubelet/pods/d41082eb-419f-4b18-8bea-0f1939d92dca/volumes"
Apr 17 08:25:30.796127 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:30.796098 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf"
Apr 17 08:25:35.936265 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:35.936221 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj" podUID="7b7ce7b8-fccb-4f5a-a259-f908d3d26902" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 17 08:25:45.936352 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:45.936307 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj" podUID="7b7ce7b8-fccb-4f5a-a259-f908d3d26902" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 17 08:25:55.936782 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:25:55.936738 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj" podUID="7b7ce7b8-fccb-4f5a-a259-f908d3d26902" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 17 08:26:05.936122 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:26:05.936058 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj" podUID="7b7ce7b8-fccb-4f5a-a259-f908d3d26902" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 17 08:26:15.936466 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:26:15.936435 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj"
Apr 17 08:27:12.912357 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:27:12.912326 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log"
Apr 17 08:27:12.916989 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:27:12.916961 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log"
Apr 17 08:27:12.917150 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:27:12.917097 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log"
Apr 17 08:27:12.921382 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:27:12.921364 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log"
Apr 17 08:32:12.934993 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:32:12.934881 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log"
Apr 17 08:32:12.939702 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:32:12.939674 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log"
Apr 17 08:32:12.939891 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:32:12.939871 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log"
Apr 17 08:32:12.944321 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:32:12.944299 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log"
Apr 17 08:34:48.637536 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:34:48.637504 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj"]
Apr 17 08:34:48.638050 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:34:48.637741 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj" podUID="7b7ce7b8-fccb-4f5a-a259-f908d3d26902" containerName="kserve-container" containerID="cri-o://507b7b69341d9c110c592b8114736edca833db9fc773eda031854d7196a3fdec" gracePeriod=30
Apr 17 08:34:51.978987 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:34:51.978958 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj"
Apr 17 08:34:52.633906 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:34:52.633865 2576 generic.go:358] "Generic (PLEG): container finished" podID="7b7ce7b8-fccb-4f5a-a259-f908d3d26902" containerID="507b7b69341d9c110c592b8114736edca833db9fc773eda031854d7196a3fdec" exitCode=0
Apr 17 08:34:52.634091 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:34:52.633929 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj"
Apr 17 08:34:52.634091 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:34:52.633941 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj" event={"ID":"7b7ce7b8-fccb-4f5a-a259-f908d3d26902","Type":"ContainerDied","Data":"507b7b69341d9c110c592b8114736edca833db9fc773eda031854d7196a3fdec"}
Apr 17 08:34:52.634091 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:34:52.633974 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj" event={"ID":"7b7ce7b8-fccb-4f5a-a259-f908d3d26902","Type":"ContainerDied","Data":"8677acd8e248ce9ad6c0336a5c11fd2eecdc787e8927586696cfaa37e20d6299"}
Apr 17 08:34:52.634091 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:34:52.633990 2576 scope.go:117] "RemoveContainer" containerID="507b7b69341d9c110c592b8114736edca833db9fc773eda031854d7196a3fdec"
Apr 17 08:34:52.642103 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:34:52.642084 2576 scope.go:117] "RemoveContainer" containerID="507b7b69341d9c110c592b8114736edca833db9fc773eda031854d7196a3fdec"
Apr 17 08:34:52.642351 ip-10-0-137-165 kubenswrapper[2576]: E0417 08:34:52.642332 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"507b7b69341d9c110c592b8114736edca833db9fc773eda031854d7196a3fdec\": container with ID starting with 507b7b69341d9c110c592b8114736edca833db9fc773eda031854d7196a3fdec not found: ID does not exist" containerID="507b7b69341d9c110c592b8114736edca833db9fc773eda031854d7196a3fdec"
Apr 17 08:34:52.642397 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:34:52.642360 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"507b7b69341d9c110c592b8114736edca833db9fc773eda031854d7196a3fdec"} err="failed to get container status \"507b7b69341d9c110c592b8114736edca833db9fc773eda031854d7196a3fdec\": rpc error: code = NotFound desc = could not find container \"507b7b69341d9c110c592b8114736edca833db9fc773eda031854d7196a3fdec\": container with ID starting with 507b7b69341d9c110c592b8114736edca833db9fc773eda031854d7196a3fdec not found: ID does not exist"
Apr 17 08:34:52.653535 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:34:52.653508 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj"]
Apr 17 08:34:52.655621 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:34:52.655600 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-32e35-predictor-855c4946c6-dv4qj"]
Apr 17 08:34:52.861584 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:34:52.861550 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b7ce7b8-fccb-4f5a-a259-f908d3d26902" path="/var/lib/kubelet/pods/7b7ce7b8-fccb-4f5a-a259-f908d3d26902/volumes"
Apr 17 08:37:12.957512 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:37:12.957401 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log"
Apr 17 08:37:12.961990 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:37:12.961972 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log"
Apr 17 08:37:12.962110 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:37:12.962004 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log"
Apr 17 08:37:12.966532 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:37:12.966515 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log"
Apr 17 08:42:08.823941 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:08.823907 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf"]
Apr 17 08:42:08.824536 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:08.824257 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf" podUID="bf2bd1ce-a247-4c04-bdfc-4e20dbdcb812" containerName="kserve-container" containerID="cri-o://d29f2ed1f0d65f66374e70a1233f961c67b90e8ff527c737c9b9235a26c3dab7" gracePeriod=30
Apr 17 08:42:10.249442 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.249406 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jlzjh/must-gather-qqmbx"]
Apr 17 08:42:10.249931 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.249732 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b7ce7b8-fccb-4f5a-a259-f908d3d26902" containerName="kserve-container"
Apr 17 08:42:10.249931 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.249744 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b7ce7b8-fccb-4f5a-a259-f908d3d26902" containerName="kserve-container"
Apr 17 08:42:10.249931 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.249758 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d41082eb-419f-4b18-8bea-0f1939d92dca" containerName="kserve-container"
Apr 17 08:42:10.249931 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.249764 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d41082eb-419f-4b18-8bea-0f1939d92dca" containerName="kserve-container"
Apr 17 08:42:10.249931 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.249826 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d41082eb-419f-4b18-8bea-0f1939d92dca" containerName="kserve-container"
Apr 17 08:42:10.249931 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.249838 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b7ce7b8-fccb-4f5a-a259-f908d3d26902" containerName="kserve-container"
Apr 17 08:42:10.252991 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.252968 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jlzjh/must-gather-qqmbx"
Apr 17 08:42:10.255309 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.255269 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jlzjh\"/\"openshift-service-ca.crt\""
Apr 17 08:42:10.255424 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.255373 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jlzjh\"/\"kube-root-ca.crt\""
Apr 17 08:42:10.256304 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.256262 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jlzjh\"/\"default-dockercfg-2zjcj\""
Apr 17 08:42:10.265937 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.265913 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jlzjh/must-gather-qqmbx"]
Apr 17 08:42:10.385988 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.385954 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lnpk\" (UniqueName: \"kubernetes.io/projected/4eacaa12-398e-473e-9a46-ccbfdc77eb41-kube-api-access-6lnpk\") pod \"must-gather-qqmbx\" (UID: \"4eacaa12-398e-473e-9a46-ccbfdc77eb41\") " pod="openshift-must-gather-jlzjh/must-gather-qqmbx"
Apr 17 08:42:10.386145 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.385999 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4eacaa12-398e-473e-9a46-ccbfdc77eb41-must-gather-output\") pod \"must-gather-qqmbx\" (UID: \"4eacaa12-398e-473e-9a46-ccbfdc77eb41\") " pod="openshift-must-gather-jlzjh/must-gather-qqmbx"
Apr 17 08:42:10.487069 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.487033 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6lnpk\" (UniqueName: \"kubernetes.io/projected/4eacaa12-398e-473e-9a46-ccbfdc77eb41-kube-api-access-6lnpk\") pod \"must-gather-qqmbx\" (UID: \"4eacaa12-398e-473e-9a46-ccbfdc77eb41\") " pod="openshift-must-gather-jlzjh/must-gather-qqmbx"
Apr 17 08:42:10.487216 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.487088 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4eacaa12-398e-473e-9a46-ccbfdc77eb41-must-gather-output\") pod \"must-gather-qqmbx\" (UID: \"4eacaa12-398e-473e-9a46-ccbfdc77eb41\") " pod="openshift-must-gather-jlzjh/must-gather-qqmbx"
Apr 17 08:42:10.487505 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.487485 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4eacaa12-398e-473e-9a46-ccbfdc77eb41-must-gather-output\") pod \"must-gather-qqmbx\" (UID: \"4eacaa12-398e-473e-9a46-ccbfdc77eb41\") " pod="openshift-must-gather-jlzjh/must-gather-qqmbx"
Apr 17 08:42:10.494889 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.494855 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lnpk\" (UniqueName: \"kubernetes.io/projected/4eacaa12-398e-473e-9a46-ccbfdc77eb41-kube-api-access-6lnpk\") pod \"must-gather-qqmbx\" (UID: \"4eacaa12-398e-473e-9a46-ccbfdc77eb41\") " pod="openshift-must-gather-jlzjh/must-gather-qqmbx"
Apr 17 08:42:10.574809 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.574743 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jlzjh/must-gather-qqmbx"
Apr 17 08:42:10.691854 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.691830 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jlzjh/must-gather-qqmbx"]
Apr 17 08:42:10.694198 ip-10-0-137-165 kubenswrapper[2576]: W0417 08:42:10.694170 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4eacaa12_398e_473e_9a46_ccbfdc77eb41.slice/crio-0fe01bf6dd00ed94233dc9716aad228c29289e953b01c4e587c4565030f41c52 WatchSource:0}: Error finding container 0fe01bf6dd00ed94233dc9716aad228c29289e953b01c4e587c4565030f41c52: Status 404 returned error can't find the container with id 0fe01bf6dd00ed94233dc9716aad228c29289e953b01c4e587c4565030f41c52
Apr 17 08:42:10.695780 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.695763 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 08:42:10.795358 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.795316 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf" podUID="bf2bd1ce-a247-4c04-bdfc-4e20dbdcb812" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 17 08:42:10.899408 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:10.899381 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jlzjh/must-gather-qqmbx" event={"ID":"4eacaa12-398e-473e-9a46-ccbfdc77eb41","Type":"ContainerStarted","Data":"0fe01bf6dd00ed94233dc9716aad228c29289e953b01c4e587c4565030f41c52"}
Apr 17 08:42:11.903742 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:11.903707 2576 generic.go:358] "Generic (PLEG): container finished" podID="bf2bd1ce-a247-4c04-bdfc-4e20dbdcb812" containerID="d29f2ed1f0d65f66374e70a1233f961c67b90e8ff527c737c9b9235a26c3dab7" exitCode=0
Apr 17 08:42:11.904204 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:11.903773 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf" event={"ID":"bf2bd1ce-a247-4c04-bdfc-4e20dbdcb812","Type":"ContainerDied","Data":"d29f2ed1f0d65f66374e70a1233f961c67b90e8ff527c737c9b9235a26c3dab7"}
Apr 17 08:42:12.186124 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:12.186103 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf"
Apr 17 08:42:12.911219 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:12.911178 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf" event={"ID":"bf2bd1ce-a247-4c04-bdfc-4e20dbdcb812","Type":"ContainerDied","Data":"87ba8e246b55655e75de718b6010c67b182b39bd38181aac87a922e22eac2abb"}
Apr 17 08:42:12.911664 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:12.911235 2576 scope.go:117] "RemoveContainer" containerID="d29f2ed1f0d65f66374e70a1233f961c67b90e8ff527c737c9b9235a26c3dab7"
Apr 17 08:42:12.911664 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:12.911201 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf" Apr 17 08:42:12.926550 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:12.926522 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf"] Apr 17 08:42:12.931591 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:12.931565 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-54cf7-predictor-56f798986b-lsftf"] Apr 17 08:42:13.030262 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:13.030166 2576 scope.go:117] "RemoveContainer" containerID="d29f2ed1f0d65f66374e70a1233f961c67b90e8ff527c737c9b9235a26c3dab7" Apr 17 08:42:14.787690 ip-10-0-137-165 kubenswrapper[2576]: E0417 08:42:14.787650 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_kserve-container_success-200-isvc-54cf7-predictor-56f798986b-lsftf_kserve-ci-e2e-test_bf2bd1ce-a247-4c04-bdfc-4e20dbdcb812_0 in pod sandbox 87ba8e246b55655e75de718b6010c67b182b39bd38181aac87a922e22eac2abb: identifier is not a container" containerID="d29f2ed1f0d65f66374e70a1233f961c67b90e8ff527c737c9b9235a26c3dab7" Apr 17 08:42:14.788070 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:14.787704 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29f2ed1f0d65f66374e70a1233f961c67b90e8ff527c737c9b9235a26c3dab7"} err="rpc error: code = Unknown desc = failed to delete container k8s_kserve-container_success-200-isvc-54cf7-predictor-56f798986b-lsftf_kserve-ci-e2e-test_bf2bd1ce-a247-4c04-bdfc-4e20dbdcb812_0 in pod sandbox 87ba8e246b55655e75de718b6010c67b182b39bd38181aac87a922e22eac2abb: identifier is not a container" Apr 17 08:42:14.862892 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:14.862862 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bf2bd1ce-a247-4c04-bdfc-4e20dbdcb812" path="/var/lib/kubelet/pods/bf2bd1ce-a247-4c04-bdfc-4e20dbdcb812/volumes" Apr 17 08:42:15.016070 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:15.016040 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log" Apr 17 08:42:15.016337 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:15.016139 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log" Apr 17 08:42:15.020659 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:15.020640 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log" Apr 17 08:42:15.020931 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:15.020919 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log" Apr 17 08:42:15.923810 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:15.923770 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jlzjh/must-gather-qqmbx" event={"ID":"4eacaa12-398e-473e-9a46-ccbfdc77eb41","Type":"ContainerStarted","Data":"d899897914e4c6bb735d6f944a7c1d60686212dc21b7ed78a936024da094b6ad"} Apr 17 08:42:15.924206 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:15.923814 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jlzjh/must-gather-qqmbx" event={"ID":"4eacaa12-398e-473e-9a46-ccbfdc77eb41","Type":"ContainerStarted","Data":"99611787665c5b2fa12412527f7d0897e1ad70b2e9a04ed1ed9de306288a951f"} Apr 17 08:42:15.939853 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:15.939801 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-jlzjh/must-gather-qqmbx" podStartSLOduration=1.5722617639999998 podStartE2EDuration="5.939786578s" podCreationTimestamp="2026-04-17 08:42:10 +0000 UTC" firstStartedPulling="2026-04-17 08:42:10.695881724 +0000 UTC m=+2998.352744915" lastFinishedPulling="2026-04-17 08:42:15.063406538 +0000 UTC m=+3002.720269729" observedRunningTime="2026-04-17 08:42:15.937675377 +0000 UTC m=+3003.594538601" watchObservedRunningTime="2026-04-17 08:42:15.939786578 +0000 UTC m=+3003.596649788" Apr 17 08:42:33.981408 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:33.981374 2576 generic.go:358] "Generic (PLEG): container finished" podID="4eacaa12-398e-473e-9a46-ccbfdc77eb41" containerID="99611787665c5b2fa12412527f7d0897e1ad70b2e9a04ed1ed9de306288a951f" exitCode=0 Apr 17 08:42:33.981808 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:33.981446 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jlzjh/must-gather-qqmbx" event={"ID":"4eacaa12-398e-473e-9a46-ccbfdc77eb41","Type":"ContainerDied","Data":"99611787665c5b2fa12412527f7d0897e1ad70b2e9a04ed1ed9de306288a951f"} Apr 17 08:42:33.981808 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:33.981764 2576 scope.go:117] "RemoveContainer" containerID="99611787665c5b2fa12412527f7d0897e1ad70b2e9a04ed1ed9de306288a951f" Apr 17 08:42:34.226353 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:34.226324 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jlzjh_must-gather-qqmbx_4eacaa12-398e-473e-9a46-ccbfdc77eb41/gather/0.log" Apr 17 08:42:37.437882 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:37.437849 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-pmpqp_85a06191-13b3-4742-8598-8d7237fae7f3/global-pull-secret-syncer/0.log" Apr 17 08:42:37.626403 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:37.626372 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_konnectivity-agent-jxxvl_502b7429-f580-4079-ad03-6c6b86f1903f/konnectivity-agent/0.log" Apr 17 08:42:37.654572 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:37.654545 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-165.ec2.internal_9a551efa739e7b700ba0ce69f0532dd7/haproxy/0.log" Apr 17 08:42:39.625084 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:39.625056 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jlzjh/must-gather-qqmbx"] Apr 17 08:42:39.625637 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:39.625290 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-jlzjh/must-gather-qqmbx" podUID="4eacaa12-398e-473e-9a46-ccbfdc77eb41" containerName="copy" containerID="cri-o://d899897914e4c6bb735d6f944a7c1d60686212dc21b7ed78a936024da094b6ad" gracePeriod=2 Apr 17 08:42:39.627239 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:39.627209 2576 status_manager.go:895] "Failed to get status for pod" podUID="4eacaa12-398e-473e-9a46-ccbfdc77eb41" pod="openshift-must-gather-jlzjh/must-gather-qqmbx" err="pods \"must-gather-qqmbx\" is forbidden: User \"system:node:ip-10-0-137-165.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-jlzjh\": no relationship found between node 'ip-10-0-137-165.ec2.internal' and this object" Apr 17 08:42:39.628258 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:39.628236 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jlzjh/must-gather-qqmbx"] Apr 17 08:42:39.848187 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:39.848165 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jlzjh_must-gather-qqmbx_4eacaa12-398e-473e-9a46-ccbfdc77eb41/copy/0.log" Apr 17 08:42:39.848574 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:39.848558 2576 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jlzjh/must-gather-qqmbx" Apr 17 08:42:39.850574 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:39.850551 2576 status_manager.go:895] "Failed to get status for pod" podUID="4eacaa12-398e-473e-9a46-ccbfdc77eb41" pod="openshift-must-gather-jlzjh/must-gather-qqmbx" err="pods \"must-gather-qqmbx\" is forbidden: User \"system:node:ip-10-0-137-165.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-jlzjh\": no relationship found between node 'ip-10-0-137-165.ec2.internal' and this object" Apr 17 08:42:39.935883 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:39.935818 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lnpk\" (UniqueName: \"kubernetes.io/projected/4eacaa12-398e-473e-9a46-ccbfdc77eb41-kube-api-access-6lnpk\") pod \"4eacaa12-398e-473e-9a46-ccbfdc77eb41\" (UID: \"4eacaa12-398e-473e-9a46-ccbfdc77eb41\") " Apr 17 08:42:39.935883 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:39.935850 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4eacaa12-398e-473e-9a46-ccbfdc77eb41-must-gather-output\") pod \"4eacaa12-398e-473e-9a46-ccbfdc77eb41\" (UID: \"4eacaa12-398e-473e-9a46-ccbfdc77eb41\") " Apr 17 08:42:39.937384 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:39.937361 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eacaa12-398e-473e-9a46-ccbfdc77eb41-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4eacaa12-398e-473e-9a46-ccbfdc77eb41" (UID: "4eacaa12-398e-473e-9a46-ccbfdc77eb41"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:42:39.937921 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:39.937900 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eacaa12-398e-473e-9a46-ccbfdc77eb41-kube-api-access-6lnpk" (OuterVolumeSpecName: "kube-api-access-6lnpk") pod "4eacaa12-398e-473e-9a46-ccbfdc77eb41" (UID: "4eacaa12-398e-473e-9a46-ccbfdc77eb41"). InnerVolumeSpecName "kube-api-access-6lnpk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:42:39.998940 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:39.998918 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jlzjh_must-gather-qqmbx_4eacaa12-398e-473e-9a46-ccbfdc77eb41/copy/0.log" Apr 17 08:42:39.999235 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:39.999213 2576 generic.go:358] "Generic (PLEG): container finished" podID="4eacaa12-398e-473e-9a46-ccbfdc77eb41" containerID="d899897914e4c6bb735d6f944a7c1d60686212dc21b7ed78a936024da094b6ad" exitCode=143 Apr 17 08:42:39.999347 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:39.999265 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jlzjh/must-gather-qqmbx" Apr 17 08:42:39.999347 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:39.999307 2576 scope.go:117] "RemoveContainer" containerID="d899897914e4c6bb735d6f944a7c1d60686212dc21b7ed78a936024da094b6ad" Apr 17 08:42:40.001390 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:40.001368 2576 status_manager.go:895] "Failed to get status for pod" podUID="4eacaa12-398e-473e-9a46-ccbfdc77eb41" pod="openshift-must-gather-jlzjh/must-gather-qqmbx" err="pods \"must-gather-qqmbx\" is forbidden: User \"system:node:ip-10-0-137-165.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-jlzjh\": no relationship found between node 'ip-10-0-137-165.ec2.internal' and this object" Apr 17 08:42:40.007415 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:40.007397 2576 scope.go:117] "RemoveContainer" containerID="99611787665c5b2fa12412527f7d0897e1ad70b2e9a04ed1ed9de306288a951f" Apr 17 08:42:40.010560 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:40.010535 2576 status_manager.go:895] "Failed to get status for pod" podUID="4eacaa12-398e-473e-9a46-ccbfdc77eb41" pod="openshift-must-gather-jlzjh/must-gather-qqmbx" err="pods \"must-gather-qqmbx\" is forbidden: User \"system:node:ip-10-0-137-165.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-jlzjh\": no relationship found between node 'ip-10-0-137-165.ec2.internal' and this object" Apr 17 08:42:40.019593 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:40.019573 2576 scope.go:117] "RemoveContainer" containerID="d899897914e4c6bb735d6f944a7c1d60686212dc21b7ed78a936024da094b6ad" Apr 17 08:42:40.019827 ip-10-0-137-165 kubenswrapper[2576]: E0417 08:42:40.019808 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d899897914e4c6bb735d6f944a7c1d60686212dc21b7ed78a936024da094b6ad\": container with ID 
starting with d899897914e4c6bb735d6f944a7c1d60686212dc21b7ed78a936024da094b6ad not found: ID does not exist" containerID="d899897914e4c6bb735d6f944a7c1d60686212dc21b7ed78a936024da094b6ad" Apr 17 08:42:40.019909 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:40.019833 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d899897914e4c6bb735d6f944a7c1d60686212dc21b7ed78a936024da094b6ad"} err="failed to get container status \"d899897914e4c6bb735d6f944a7c1d60686212dc21b7ed78a936024da094b6ad\": rpc error: code = NotFound desc = could not find container \"d899897914e4c6bb735d6f944a7c1d60686212dc21b7ed78a936024da094b6ad\": container with ID starting with d899897914e4c6bb735d6f944a7c1d60686212dc21b7ed78a936024da094b6ad not found: ID does not exist" Apr 17 08:42:40.019909 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:40.019851 2576 scope.go:117] "RemoveContainer" containerID="99611787665c5b2fa12412527f7d0897e1ad70b2e9a04ed1ed9de306288a951f" Apr 17 08:42:40.020097 ip-10-0-137-165 kubenswrapper[2576]: E0417 08:42:40.020078 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99611787665c5b2fa12412527f7d0897e1ad70b2e9a04ed1ed9de306288a951f\": container with ID starting with 99611787665c5b2fa12412527f7d0897e1ad70b2e9a04ed1ed9de306288a951f not found: ID does not exist" containerID="99611787665c5b2fa12412527f7d0897e1ad70b2e9a04ed1ed9de306288a951f" Apr 17 08:42:40.020144 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:40.020102 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99611787665c5b2fa12412527f7d0897e1ad70b2e9a04ed1ed9de306288a951f"} err="failed to get container status \"99611787665c5b2fa12412527f7d0897e1ad70b2e9a04ed1ed9de306288a951f\": rpc error: code = NotFound desc = could not find container \"99611787665c5b2fa12412527f7d0897e1ad70b2e9a04ed1ed9de306288a951f\": container with ID starting with 
99611787665c5b2fa12412527f7d0897e1ad70b2e9a04ed1ed9de306288a951f not found: ID does not exist" Apr 17 08:42:40.036540 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:40.036509 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6lnpk\" (UniqueName: \"kubernetes.io/projected/4eacaa12-398e-473e-9a46-ccbfdc77eb41-kube-api-access-6lnpk\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\"" Apr 17 08:42:40.036540 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:40.036539 2576 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4eacaa12-398e-473e-9a46-ccbfdc77eb41-must-gather-output\") on node \"ip-10-0-137-165.ec2.internal\" DevicePath \"\"" Apr 17 08:42:40.861712 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:40.861681 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eacaa12-398e-473e-9a46-ccbfdc77eb41" path="/var/lib/kubelet/pods/4eacaa12-398e-473e-9a46-ccbfdc77eb41/volumes" Apr 17 08:42:41.027456 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:41.027426 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-xt24b_b39b6955-dc2b-4aa2-9bc9-eb1ea55b33be/cluster-monitoring-operator/0.log" Apr 17 08:42:41.417782 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:41.417750 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h7xpg_5c6eb6cc-f216-4722-8e3d-297a548b8b5d/node-exporter/0.log" Apr 17 08:42:41.452479 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:41.452456 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h7xpg_5c6eb6cc-f216-4722-8e3d-297a548b8b5d/kube-rbac-proxy/0.log" Apr 17 08:42:41.493189 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:41.493166 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-h7xpg_5c6eb6cc-f216-4722-8e3d-297a548b8b5d/init-textfile/0.log" Apr 17 08:42:41.538452 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:41.538376 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-59scv_f077267b-3589-45a9-a1ef-ad4d07a595cf/kube-rbac-proxy-main/0.log" Apr 17 08:42:41.568369 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:41.568346 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-59scv_f077267b-3589-45a9-a1ef-ad4d07a595cf/kube-rbac-proxy-self/0.log" Apr 17 08:42:41.605566 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:41.605543 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-59scv_f077267b-3589-45a9-a1ef-ad4d07a595cf/openshift-state-metrics/0.log" Apr 17 08:42:41.664583 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:41.664560 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_38ce247e-0fc4-4704-abe7-7e265a873052/prometheus/0.log" Apr 17 08:42:41.720394 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:41.720365 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_38ce247e-0fc4-4704-abe7-7e265a873052/config-reloader/0.log" Apr 17 08:42:41.760732 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:41.760701 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_38ce247e-0fc4-4704-abe7-7e265a873052/thanos-sidecar/0.log" Apr 17 08:42:41.813499 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:41.813422 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_38ce247e-0fc4-4704-abe7-7e265a873052/kube-rbac-proxy-web/0.log" Apr 17 08:42:41.850347 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:41.850319 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_38ce247e-0fc4-4704-abe7-7e265a873052/kube-rbac-proxy/0.log" Apr 17 08:42:41.888434 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:41.888408 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_38ce247e-0fc4-4704-abe7-7e265a873052/kube-rbac-proxy-thanos/0.log" Apr 17 08:42:41.929315 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:41.929269 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_38ce247e-0fc4-4704-abe7-7e265a873052/init-config-reloader/0.log" Apr 17 08:42:41.974727 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:41.974697 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jmncg_17ff8df6-8cf5-42bd-923b-68b156ef4cf5/prometheus-operator/0.log" Apr 17 08:42:42.009331 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:42.009301 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jmncg_17ff8df6-8cf5-42bd-923b-68b156ef4cf5/kube-rbac-proxy/0.log" Apr 17 08:42:42.050389 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:42.050362 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-8nzhl_52412a8f-a644-49c4-9da0-e872be529692/prometheus-operator-admission-webhook/0.log" Apr 17 08:42:43.409047 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:43.409016 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-529rs_7a34eada-e251-4bc7-8937-8f933c0cbd6f/networking-console-plugin/0.log" Apr 17 08:42:43.790536 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:43.790448 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/2.log" Apr 17 08:42:43.797952 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:43.797930 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zrcx8_5f854e39-db34-4c83-8d33-c1d1898b7133/console-operator/3.log" Apr 17 08:42:44.165148 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.165116 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-6594d_3779ef4e-6fc6-4738-bab0-223e27cbd53b/download-server/0.log" Apr 17 08:42:44.546139 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.546059 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"] Apr 17 08:42:44.546519 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.546386 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4eacaa12-398e-473e-9a46-ccbfdc77eb41" containerName="gather" Apr 17 08:42:44.546519 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.546397 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eacaa12-398e-473e-9a46-ccbfdc77eb41" containerName="gather" Apr 17 08:42:44.546519 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.546417 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4eacaa12-398e-473e-9a46-ccbfdc77eb41" containerName="copy" Apr 17 08:42:44.546519 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.546423 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eacaa12-398e-473e-9a46-ccbfdc77eb41" containerName="copy" Apr 17 08:42:44.546519 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.546432 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf2bd1ce-a247-4c04-bdfc-4e20dbdcb812" containerName="kserve-container" Apr 17 08:42:44.546519 ip-10-0-137-165 
kubenswrapper[2576]: I0417 08:42:44.546437 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2bd1ce-a247-4c04-bdfc-4e20dbdcb812" containerName="kserve-container" Apr 17 08:42:44.546519 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.546484 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf2bd1ce-a247-4c04-bdfc-4e20dbdcb812" containerName="kserve-container" Apr 17 08:42:44.546519 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.546494 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4eacaa12-398e-473e-9a46-ccbfdc77eb41" containerName="gather" Apr 17 08:42:44.546519 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.546502 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4eacaa12-398e-473e-9a46-ccbfdc77eb41" containerName="copy" Apr 17 08:42:44.550387 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.550367 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8" Apr 17 08:42:44.552607 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.552587 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gxzkb\"/\"default-dockercfg-pdhdf\"" Apr 17 08:42:44.553494 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.553476 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gxzkb\"/\"openshift-service-ca.crt\"" Apr 17 08:42:44.553575 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.553557 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gxzkb\"/\"kube-root-ca.crt\"" Apr 17 08:42:44.556369 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.556351 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"] Apr 17 08:42:44.678837 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.678804 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3ad75c86-3d63-4231-9425-fd755c5ae9c3-podres\") pod \"perf-node-gather-daemonset-2fwk8\" (UID: \"3ad75c86-3d63-4231-9425-fd755c5ae9c3\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"
Apr 17 08:42:44.679005 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.678846 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3ad75c86-3d63-4231-9425-fd755c5ae9c3-sys\") pod \"perf-node-gather-daemonset-2fwk8\" (UID: \"3ad75c86-3d63-4231-9425-fd755c5ae9c3\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"
Apr 17 08:42:44.679005 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.678864 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3ad75c86-3d63-4231-9425-fd755c5ae9c3-proc\") pod \"perf-node-gather-daemonset-2fwk8\" (UID: \"3ad75c86-3d63-4231-9425-fd755c5ae9c3\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"
Apr 17 08:42:44.679005 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.678908 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dt5p\" (UniqueName: \"kubernetes.io/projected/3ad75c86-3d63-4231-9425-fd755c5ae9c3-kube-api-access-2dt5p\") pod \"perf-node-gather-daemonset-2fwk8\" (UID: \"3ad75c86-3d63-4231-9425-fd755c5ae9c3\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"
Apr 17 08:42:44.679005 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.678972 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3ad75c86-3d63-4231-9425-fd755c5ae9c3-lib-modules\") pod \"perf-node-gather-daemonset-2fwk8\" (UID: \"3ad75c86-3d63-4231-9425-fd755c5ae9c3\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"
Apr 17 08:42:44.780387 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.780355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3ad75c86-3d63-4231-9425-fd755c5ae9c3-lib-modules\") pod \"perf-node-gather-daemonset-2fwk8\" (UID: \"3ad75c86-3d63-4231-9425-fd755c5ae9c3\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"
Apr 17 08:42:44.780550 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.780413 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3ad75c86-3d63-4231-9425-fd755c5ae9c3-podres\") pod \"perf-node-gather-daemonset-2fwk8\" (UID: \"3ad75c86-3d63-4231-9425-fd755c5ae9c3\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"
Apr 17 08:42:44.780550 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.780441 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3ad75c86-3d63-4231-9425-fd755c5ae9c3-sys\") pod \"perf-node-gather-daemonset-2fwk8\" (UID: \"3ad75c86-3d63-4231-9425-fd755c5ae9c3\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"
Apr 17 08:42:44.780550 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.780460 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3ad75c86-3d63-4231-9425-fd755c5ae9c3-proc\") pod \"perf-node-gather-daemonset-2fwk8\" (UID: \"3ad75c86-3d63-4231-9425-fd755c5ae9c3\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"
Apr 17 08:42:44.780550 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.780482 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2dt5p\" (UniqueName: \"kubernetes.io/projected/3ad75c86-3d63-4231-9425-fd755c5ae9c3-kube-api-access-2dt5p\") pod \"perf-node-gather-daemonset-2fwk8\" (UID: \"3ad75c86-3d63-4231-9425-fd755c5ae9c3\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"
Apr 17 08:42:44.780711 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.780544 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3ad75c86-3d63-4231-9425-fd755c5ae9c3-lib-modules\") pod \"perf-node-gather-daemonset-2fwk8\" (UID: \"3ad75c86-3d63-4231-9425-fd755c5ae9c3\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"
Apr 17 08:42:44.780711 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.780560 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3ad75c86-3d63-4231-9425-fd755c5ae9c3-podres\") pod \"perf-node-gather-daemonset-2fwk8\" (UID: \"3ad75c86-3d63-4231-9425-fd755c5ae9c3\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"
Apr 17 08:42:44.780711 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.780582 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3ad75c86-3d63-4231-9425-fd755c5ae9c3-proc\") pod \"perf-node-gather-daemonset-2fwk8\" (UID: \"3ad75c86-3d63-4231-9425-fd755c5ae9c3\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"
Apr 17 08:42:44.780711 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.780563 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3ad75c86-3d63-4231-9425-fd755c5ae9c3-sys\") pod \"perf-node-gather-daemonset-2fwk8\" (UID: \"3ad75c86-3d63-4231-9425-fd755c5ae9c3\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"
Apr 17 08:42:44.788103 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.788083 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dt5p\" (UniqueName: \"kubernetes.io/projected/3ad75c86-3d63-4231-9425-fd755c5ae9c3-kube-api-access-2dt5p\") pod \"perf-node-gather-daemonset-2fwk8\" (UID: \"3ad75c86-3d63-4231-9425-fd755c5ae9c3\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"
Apr 17 08:42:44.861687 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.861662 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"
Apr 17 08:42:44.975072 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:44.975038 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"]
Apr 17 08:42:44.978177 ip-10-0-137-165 kubenswrapper[2576]: W0417 08:42:44.978145 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3ad75c86_3d63_4231_9425_fd755c5ae9c3.slice/crio-14bbe3f4ede01c7dd19a000bb9fe8ba5c42f732150c3caaa998cdd7a9b068bb4 WatchSource:0}: Error finding container 14bbe3f4ede01c7dd19a000bb9fe8ba5c42f732150c3caaa998cdd7a9b068bb4: Status 404 returned error can't find the container with id 14bbe3f4ede01c7dd19a000bb9fe8ba5c42f732150c3caaa998cdd7a9b068bb4
Apr 17 08:42:45.016521 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:45.016496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8" event={"ID":"3ad75c86-3d63-4231-9425-fd755c5ae9c3","Type":"ContainerStarted","Data":"14bbe3f4ede01c7dd19a000bb9fe8ba5c42f732150c3caaa998cdd7a9b068bb4"}
Apr 17 08:42:45.237034 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:45.236950 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6f2m8_b1a369df-257c-47a4-96da-3025f897b1dd/dns/0.log"
Apr 17 08:42:45.255110 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:45.255079 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6f2m8_b1a369df-257c-47a4-96da-3025f897b1dd/kube-rbac-proxy/0.log"
Apr 17 08:42:45.337250 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:45.337219 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-xvmfc_5cf29621-68bf-43a5-94a8-643b390fca92/dns-node-resolver/0.log"
Apr 17 08:42:45.809176 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:45.809147 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ssttk_5a79228b-b4cc-4d96-b9a2-a587214f9a0d/node-ca/0.log"
Apr 17 08:42:46.021193 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:46.021154 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8" event={"ID":"3ad75c86-3d63-4231-9425-fd755c5ae9c3","Type":"ContainerStarted","Data":"2f3dce1eadd87ade4bd154a5042d4738b1672e5bd1323d55874e9f7c5d212b60"}
Apr 17 08:42:46.021389 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:46.021297 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"
Apr 17 08:42:46.037319 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:46.037256 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8" podStartSLOduration=2.037239798 podStartE2EDuration="2.037239798s" podCreationTimestamp="2026-04-17 08:42:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:42:46.036339499 +0000 UTC m=+3033.693202713" watchObservedRunningTime="2026-04-17 08:42:46.037239798 +0000 UTC m=+3033.694103012"
Apr 17 08:42:46.467966 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:46.467940 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7bcb9d645d-ktdgl_999368dd-92dc-4226-a15b-73baf3d1e08a/router/0.log"
Apr 17 08:42:46.779454 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:46.779382 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9w6bd_8ff81f11-f2e2-4838-a775-e57edc28571c/serve-healthcheck-canary/0.log"
Apr 17 08:42:47.171269 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:47.171238 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-bvdcl_58646635-9ae5-4468-b026-e2e262f7810c/insights-operator/0.log"
Apr 17 08:42:47.171837 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:47.171819 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-bvdcl_58646635-9ae5-4468-b026-e2e262f7810c/insights-operator/1.log"
Apr 17 08:42:47.256429 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:47.256400 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-56dl7_985c2da4-24ca-41d7-8de7-5cf5760588db/kube-rbac-proxy/0.log"
Apr 17 08:42:47.275645 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:47.275619 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-56dl7_985c2da4-24ca-41d7-8de7-5cf5760588db/exporter/0.log"
Apr 17 08:42:47.295288 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:47.295247 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-56dl7_985c2da4-24ca-41d7-8de7-5cf5760588db/extractor/0.log"
Apr 17 08:42:49.522714 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:49.522685 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-m77jz_83398091-9227-415c-b435-dca3c86f22ff/s3-init/0.log"
Apr 17 08:42:52.034663 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:52.034633 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-2fwk8"
Apr 17 08:42:53.522251 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:53.522209 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-fn8dn_ef9177b8-0879-4607-8085-b87914bfa611/kube-storage-version-migrator-operator/1.log"
Apr 17 08:42:53.524060 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:53.524036 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-fn8dn_ef9177b8-0879-4607-8085-b87914bfa611/kube-storage-version-migrator-operator/0.log"
Apr 17 08:42:54.576123 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:54.576097 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9lt4x_70917b3b-92d9-4406-9795-92a9f4be21ea/kube-multus/0.log"
Apr 17 08:42:54.894457 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:54.894427 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w4n4s_28fe626d-8e66-482b-b03e-847bb5829a0b/kube-multus-additional-cni-plugins/0.log"
Apr 17 08:42:54.914029 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:54.914007 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w4n4s_28fe626d-8e66-482b-b03e-847bb5829a0b/egress-router-binary-copy/0.log"
Apr 17 08:42:54.933217 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:54.933197 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w4n4s_28fe626d-8e66-482b-b03e-847bb5829a0b/cni-plugins/0.log"
Apr 17 08:42:54.954265 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:54.954240 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w4n4s_28fe626d-8e66-482b-b03e-847bb5829a0b/bond-cni-plugin/0.log"
Apr 17 08:42:54.975229 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:54.975205 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w4n4s_28fe626d-8e66-482b-b03e-847bb5829a0b/routeoverride-cni/0.log"
Apr 17 08:42:54.994740 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:54.994717 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w4n4s_28fe626d-8e66-482b-b03e-847bb5829a0b/whereabouts-cni-bincopy/0.log"
Apr 17 08:42:55.014292 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:55.014255 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w4n4s_28fe626d-8e66-482b-b03e-847bb5829a0b/whereabouts-cni/0.log"
Apr 17 08:42:55.151018 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:55.150935 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wxvbl_0cd46437-1e4d-4927-88fe-3d5f18ee621d/network-metrics-daemon/0.log"
Apr 17 08:42:55.173612 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:55.173587 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wxvbl_0cd46437-1e4d-4927-88fe-3d5f18ee621d/kube-rbac-proxy/0.log"
Apr 17 08:42:55.929123 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:55.929078 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-controller/0.log"
Apr 17 08:42:55.945547 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:55.945519 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/0.log"
Apr 17 08:42:55.971876 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:55.971846 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovn-acl-logging/1.log"
Apr 17 08:42:55.994413 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:55.994336 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/kube-rbac-proxy-node/0.log"
Apr 17 08:42:56.016083 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:56.016053 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 08:42:56.034535 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:56.034505 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/northd/0.log"
Apr 17 08:42:56.054620 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:56.054600 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/nbdb/0.log"
Apr 17 08:42:56.075237 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:56.075211 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/sbdb/0.log"
Apr 17 08:42:56.246139 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:56.246052 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h7zs8_d0fdfd60-0abc-4f38-bff8-7936432cb97b/ovnkube-controller/0.log"
Apr 17 08:42:57.781644 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:57.781611 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-rw8ct_b17b42fe-4930-48b1-ac74-5439d9fc893c/network-check-target-container/0.log"
Apr 17 08:42:58.642144 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:58.642121 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-fkhld_d7cf70ff-ceb3-4797-94a2-b29fbacd8f78/iptables-alerter/0.log"
Apr 17 08:42:59.308308 ip-10-0-137-165 kubenswrapper[2576]: I0417 08:42:59.308261 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-cpkz9_e9a8b24f-6854-4d01-95e6-4bc5d1edd592/tuned/0.log"