Apr 23 08:48:08.357687 ip-10-0-139-48 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 08:48:08.815216 ip-10-0-139-48 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:48:08.815216 ip-10-0-139-48 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 08:48:08.815216 ip-10-0-139-48 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:48:08.815216 ip-10-0-139-48 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 08:48:08.815216 ip-10-0-139-48 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
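The deprecation warnings above all point at the same migration: move these flags into the KubeletConfiguration file named by --config (here /etc/kubernetes/kubelet.conf). A minimal sketch of the config-file equivalents, assuming the kubelet.config.k8s.io/v1beta1 API; the volumePluginDir path and systemReserved/evictionHard values below are illustrative, not taken from this log, and --pod-infra-container-image has no config-file field (the sandbox image moves to the container runtime's configuration instead):

```yaml
# Sketch: config-file equivalents for the deprecated kubelet flags.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (CRI-O socket from this log)
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir (path is illustrative)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (values are illustrative)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction thresholds
evictionHard:
  memory.available: 100Mi
```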
Apr 23 08:48:08.818122 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.818028 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 08:48:08.821072 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821057 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:48:08.821114 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821073 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:48:08.821114 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821077 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:48:08.821114 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821080 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:48:08.821114 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821083 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:48:08.821114 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821086 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:48:08.821114 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821089 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:48:08.821114 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821092 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:48:08.821114 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821094 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:48:08.821114 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821098 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:48:08.821114 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821101 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:48:08.821114 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821103 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:48:08.821114 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821106 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:48:08.821114 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821108 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:48:08.821114 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821111 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:48:08.821114 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821114 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:48:08.821114 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821116 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:48:08.821114 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821119 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:48:08.821114 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821121 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:48:08.821114 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821124 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821127 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821130 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821133 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821135 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821138 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821140 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821143 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821146 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821148 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821150 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821153 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821155 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821158 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821160 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821163 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821166 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821170 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821172 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821175 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821177 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:48:08.821580 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821179 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:48:08.822122 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821182 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:48:08.822122 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821184 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:48:08.822122 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821187 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:48:08.822122 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821189 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:48:08.822122 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821192 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:48:08.822122 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821194 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:48:08.822122 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821196 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:48:08.822122 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821199 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:48:08.822122 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821201 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:48:08.822122 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821203 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:48:08.822122 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821206 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:48:08.822122 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821209 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:48:08.822122 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821212 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:48:08.822122 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821215 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:48:08.822122 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821217 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:48:08.822122 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821220 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:48:08.822122 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821222 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:48:08.822122 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821225 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:48:08.822122 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821228 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:48:08.822602 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821230 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:48:08.822602 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821232 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:48:08.822602 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821235 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:48:08.822602 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821237 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:48:08.822602 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821240 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:48:08.822602 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821243 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:48:08.822602 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821245 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:48:08.822602 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821247 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:48:08.822602 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821250 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:48:08.822602 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821253 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:48:08.822602 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821255 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:48:08.822602 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821261 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:48:08.822602 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821264 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:48:08.822602 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821267 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:48:08.822602 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821270 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:48:08.822602 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821273 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:48:08.822602 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821275 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:48:08.822602 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821278 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:48:08.822602 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821280 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:48:08.822602 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821282 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:48:08.823079 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821285 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:48:08.823079 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821287 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:48:08.823079 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821291 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:48:08.823079 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821295 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:48:08.823079 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821298 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:48:08.823079 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821301 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:48:08.823079 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821708 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:48:08.823079 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821713 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:48:08.823079 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821716 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:48:08.823079 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821719 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:48:08.823079 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821721 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:48:08.823079 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821724 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:48:08.823079 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821726 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:48:08.823079 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821729 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:48:08.823079 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821731 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:48:08.823079 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821734 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:48:08.823079 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821737 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:48:08.823079 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821739 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:48:08.823079 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821742 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:48:08.823567 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821745 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:48:08.823567 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821750 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:48:08.823567 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821753 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:48:08.823567 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821756 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:48:08.823567 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821758 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:48:08.823567 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821761 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:48:08.823567 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821763 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:48:08.823567 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821766 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:48:08.823567 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821768 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:48:08.823567 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821771 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:48:08.823567 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821773 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:48:08.823567 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821776 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:48:08.823567 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821778 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:48:08.823567 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821781 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:48:08.823567 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821783 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:48:08.823567 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821786 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:48:08.823567 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821788 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:48:08.823567 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821791 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:48:08.823567 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821794 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:48:08.823567 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821796 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:48:08.824069 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821799 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:48:08.824069 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821801 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:48:08.824069 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821804 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:48:08.824069 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821806 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:48:08.824069 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821809 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:48:08.824069 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821812 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:48:08.824069 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821816 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:48:08.824069 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821818 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:48:08.824069 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821821 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:48:08.824069 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821823 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:48:08.824069 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821825 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:48:08.824069 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821828 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:48:08.824069 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821830 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:48:08.824069 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821833 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:48:08.824069 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821835 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:48:08.824069 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821838 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:48:08.824069 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821840 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:48:08.824069 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821842 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:48:08.824069 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821845 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:48:08.824551 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821848 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:48:08.824551 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821850 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:48:08.824551 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821853 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:48:08.824551 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821855 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:48:08.824551 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821857 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:48:08.824551 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821860 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:48:08.824551 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821862 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:48:08.824551 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821864 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:48:08.824551 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821867 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:48:08.824551 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821869 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:48:08.824551 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821872 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:48:08.824551 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821875 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:48:08.824551 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821879 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:48:08.824551 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821883 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:48:08.824551 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821886 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:48:08.824551 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821888 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:48:08.824551 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821891 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:48:08.824551 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821893 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:48:08.824551 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821895 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821898 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821901 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821904 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821907 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821909 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821911 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821914 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821916 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821918 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821921 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821924 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821927 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821929 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821932 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.821934 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822007 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822014 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822020 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822025 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822029 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 08:48:08.825052 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822032 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822036 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822041 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822045 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822049 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822052 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822056 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822059 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822062 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822064 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822067 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822070 2574 flags.go:64] FLAG: --cloud-config=""
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822072 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822075 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822079 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822082 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822085 2574 flags.go:64] FLAG: --config-dir=""
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822088 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822091 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822096 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822099 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822102 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822106 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822109 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 08:48:08.825573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822112 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822114 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822118 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822120 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822125 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822128 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822131 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822133 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822137 2574 flags.go:64] FLAG: --enable-server="true"
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822139 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822144 2574 flags.go:64] FLAG: --event-burst="100"
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822148 2574 flags.go:64] FLAG: --event-qps="50"
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822151 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822154 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822157 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822161 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822163 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822166 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822170 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822172 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822175 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822178 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822181 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822184 2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822186 2574 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 08:48:08.826150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822189 2574 flags.go:64] FLAG: --feature-gates=""
Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822193 2574 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822196 2574 flags.go:64] FLAG:
--global-housekeeping-interval="1m0s" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822199 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822202 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822205 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822208 2574 flags.go:64] FLAG: --help="false" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822211 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-139-48.ec2.internal" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822214 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822217 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822219 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822222 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822226 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822229 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822231 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822234 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 08:48:08.826806 ip-10-0-139-48 
kubenswrapper[2574]: I0423 08:48:08.822237 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822239 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822242 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822246 2574 flags.go:64] FLAG: --kube-reserved="" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822249 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822252 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822254 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822257 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 08:48:08.826806 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822260 2574 flags.go:64] FLAG: --lock-file="" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822262 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822265 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822268 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822273 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822276 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822278 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 
08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822281 2574 flags.go:64] FLAG: --logging-format="text" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822284 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822287 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822289 2574 flags.go:64] FLAG: --manifest-url="" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822292 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822297 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822300 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822304 2574 flags.go:64] FLAG: --max-pods="110" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822307 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822310 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822313 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822316 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822319 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822322 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822325 2574 
flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822331 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822334 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822337 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 08:48:08.827399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822340 2574 flags.go:64] FLAG: --pod-cidr="" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822343 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822348 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822352 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822356 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822359 2574 flags.go:64] FLAG: --port="10250" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822362 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822364 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f7c8ccdf08b3bb59" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822367 2574 flags.go:64] FLAG: --qos-reserved="" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822370 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 
08:48:08.822373 2574 flags.go:64] FLAG: --register-node="true" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822376 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822379 2574 flags.go:64] FLAG: --register-with-taints="" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822382 2574 flags.go:64] FLAG: --registry-burst="10" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822399 2574 flags.go:64] FLAG: --registry-qps="5" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822402 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822405 2574 flags.go:64] FLAG: --reserved-memory="" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822409 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822412 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822415 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822419 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822422 2574 flags.go:64] FLAG: --runonce="false" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822425 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822428 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822431 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 23 08:48:08.827991 ip-10-0-139-48 kubenswrapper[2574]: I0423 
08:48:08.822434 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822438 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822441 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822444 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822447 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822450 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822452 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822455 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822458 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822461 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822464 2574 flags.go:64] FLAG: --system-cgroups="" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822467 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822472 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822475 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822477 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" 
Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822481 2574 flags.go:64] FLAG: --tls-min-version="" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822484 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822487 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822489 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822492 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822495 2574 flags.go:64] FLAG: --v="2" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822500 2574 flags.go:64] FLAG: --version="false" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822504 2574 flags.go:64] FLAG: --vmodule="" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822509 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.822512 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 08:48:08.828612 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822601 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 08:48:08.829261 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822605 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 08:48:08.829261 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822608 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 08:48:08.829261 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822611 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 08:48:08.829261 
ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822613 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 08:48:08.829261 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822616 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 08:48:08.829261 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822619 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 08:48:08.829261 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822623 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 08:48:08.829261 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822626 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 08:48:08.829261 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822628 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 08:48:08.829261 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822630 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 08:48:08.829261 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822633 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 08:48:08.829261 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822636 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 08:48:08.829261 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822639 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 08:48:08.829261 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822641 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 08:48:08.829261 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822643 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 08:48:08.829261 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822646 2574 feature_gate.go:328] unrecognized feature 
gate: SetEIPForNLBIngressController Apr 23 08:48:08.829261 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822648 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 08:48:08.829261 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822651 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 08:48:08.829261 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822653 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 08:48:08.829261 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822656 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 08:48:08.830100 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822658 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 08:48:08.830100 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822661 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 08:48:08.830100 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822663 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 08:48:08.830100 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822665 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 08:48:08.830100 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822668 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 08:48:08.830100 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822670 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 08:48:08.830100 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822673 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 08:48:08.830100 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822675 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 08:48:08.830100 ip-10-0-139-48 kubenswrapper[2574]: W0423 
08:48:08.822678 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 08:48:08.830100 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822684 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 08:48:08.830100 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822687 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 08:48:08.830100 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822690 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 08:48:08.830100 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822693 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 08:48:08.830100 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822698 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 08:48:08.830100 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822700 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 08:48:08.830100 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822704 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 08:48:08.830100 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822706 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 08:48:08.830100 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822709 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 08:48:08.830100 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822712 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 08:48:08.830595 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822715 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 08:48:08.830595 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822718 2574 feature_gate.go:328] 
unrecognized feature gate: DualReplica Apr 23 08:48:08.830595 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822721 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 08:48:08.830595 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822723 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 08:48:08.830595 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822725 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 08:48:08.830595 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822728 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 08:48:08.830595 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822730 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 08:48:08.830595 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822733 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 08:48:08.830595 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822737 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 08:48:08.830595 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822740 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 08:48:08.830595 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822743 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 08:48:08.830595 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822746 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:48:08.830595 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822748 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 08:48:08.830595 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822751 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 08:48:08.830595 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822753 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 08:48:08.830595 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822756 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 08:48:08.830595 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822758 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 08:48:08.830595 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822761 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 08:48:08.830595 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822763 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822766 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822769 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 
08:48:08.822771 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822775 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822778 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822780 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822783 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822786 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822789 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822791 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822794 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822797 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822799 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822802 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822805 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822807 
2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822810 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822812 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822815 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822817 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 08:48:08.831055 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822820 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 08:48:08.831709 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822822 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 08:48:08.831709 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822825 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 08:48:08.831709 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822827 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 08:48:08.831709 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822830 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 08:48:08.831709 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822832 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 08:48:08.831709 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.822835 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 08:48:08.831709 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.823543 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true 
MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 08:48:08.831709 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.831349 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 08:48:08.831709 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.831486 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 08:48:08.831709 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831538 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 08:48:08.831709 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831543 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 08:48:08.831709 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831546 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 08:48:08.831709 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831549 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 08:48:08.831709 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831552 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 08:48:08.831709 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831556 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 08:48:08.832083 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831560 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 08:48:08.832083 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831563 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 08:48:08.832083 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831565 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 08:48:08.832083 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831570 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 08:48:08.832083 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831572 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 08:48:08.832083 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831575 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 08:48:08.832083 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831577 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 08:48:08.832083 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831580 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 08:48:08.832083 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831583 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 08:48:08.832083 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831585 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 08:48:08.832083 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831588 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:48:08.832083 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831591 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 08:48:08.832083 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831594 2574 feature_gate.go:328] unrecognized 
feature gate: ConsolePluginContentSecurityPolicy Apr 23 08:48:08.832083 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831596 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 08:48:08.832083 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831599 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 08:48:08.832083 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831602 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 08:48:08.832083 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831604 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 08:48:08.832083 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831607 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 08:48:08.832083 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831610 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 08:48:08.832083 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831613 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831616 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831619 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831622 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831624 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831627 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831630 2574 feature_gate.go:328] 
unrecognized feature gate: AzureMultiDisk Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831633 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831636 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831638 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831641 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831643 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831646 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831648 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831651 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831654 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831656 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831659 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831661 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831663 2574 
feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831666 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 23 08:48:08.832601 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831668 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 08:48:08.833109 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831671 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 08:48:08.833109 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831673 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 08:48:08.833109 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831676 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:48:08.833109 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831679 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 08:48:08.833109 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831682 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 08:48:08.833109 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831685 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 08:48:08.833109 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831687 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 08:48:08.833109 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831690 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 08:48:08.833109 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831692 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 08:48:08.833109 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831694 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 
08:48:08.833109 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831697 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 08:48:08.833109 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831699 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 08:48:08.833109 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831702 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 08:48:08.833109 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831705 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 08:48:08.833109 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831707 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 08:48:08.833109 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831710 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 08:48:08.833109 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831713 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 08:48:08.833109 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831716 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 08:48:08.833109 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831718 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 08:48:08.833676 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831721 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 08:48:08.833676 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831724 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 08:48:08.833676 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831727 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 08:48:08.833676 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831729 2574 feature_gate.go:328] unrecognized feature gate: 
NetworkLiveMigration Apr 23 08:48:08.833676 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831731 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 08:48:08.833676 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831734 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 08:48:08.833676 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831737 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 08:48:08.833676 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831739 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 08:48:08.833676 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831742 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 08:48:08.833676 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831745 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 08:48:08.833676 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831749 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 08:48:08.833676 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831751 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 08:48:08.833676 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831754 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 08:48:08.833676 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831756 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 08:48:08.833676 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831759 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 08:48:08.833676 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831763 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 08:48:08.833676 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831767 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 08:48:08.833676 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831770 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 08:48:08.833676 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831773 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 08:48:08.834159 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831775 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 08:48:08.834159 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.831781 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 08:48:08.834159 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831877 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 08:48:08.834159 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831882 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 08:48:08.834159 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831885 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 08:48:08.834159 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831888 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 08:48:08.834159 ip-10-0-139-48 kubenswrapper[2574]: W0423 
08:48:08.831891 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:48:08.834159 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831894 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 08:48:08.834159 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831897 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 08:48:08.834159 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831900 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 08:48:08.834159 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831903 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 08:48:08.834159 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831906 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 08:48:08.834159 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831909 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 08:48:08.834159 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831912 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 08:48:08.834159 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831916 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 08:48:08.834546 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831918 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 08:48:08.834546 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831921 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 08:48:08.834546 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831923 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 08:48:08.834546 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831926 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 08:48:08.834546 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831928 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 08:48:08.834546 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831931 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 08:48:08.834546 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831933 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 08:48:08.834546 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831936 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 08:48:08.834546 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831938 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 08:48:08.834546 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831942 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 08:48:08.834546 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831945 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 08:48:08.834546 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831947 2574 feature_gate.go:328] 
unrecognized feature gate: Example Apr 23 08:48:08.834546 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831950 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 08:48:08.834546 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831952 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 08:48:08.834546 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831955 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 08:48:08.834546 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831959 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 08:48:08.834546 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831962 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 08:48:08.834546 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831965 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 08:48:08.834546 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831968 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 08:48:08.834546 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831971 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 08:48:08.835076 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831973 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 08:48:08.835076 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831976 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 08:48:08.835076 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831979 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 08:48:08.835076 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831982 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 08:48:08.835076 
ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831985 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 08:48:08.835076 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831988 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 08:48:08.835076 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831990 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 08:48:08.835076 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831993 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 08:48:08.835076 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831995 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 08:48:08.835076 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.831998 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 08:48:08.835076 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832000 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 08:48:08.835076 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832002 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 08:48:08.835076 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832005 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 08:48:08.835076 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832007 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 08:48:08.835076 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832010 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 08:48:08.835076 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832012 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 08:48:08.835076 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832014 2574 feature_gate.go:328] unrecognized feature gate: 
VolumeGroupSnapshot Apr 23 08:48:08.835076 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832017 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 08:48:08.835076 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832019 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 08:48:08.835076 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832021 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 08:48:08.835583 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832024 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 08:48:08.835583 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832026 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:48:08.835583 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832030 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 08:48:08.835583 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832032 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 08:48:08.835583 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832035 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 08:48:08.835583 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832037 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 08:48:08.835583 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832040 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 08:48:08.835583 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832042 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 08:48:08.835583 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832044 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 08:48:08.835583 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832046 2574 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 08:48:08.835583 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832049 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 08:48:08.835583 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832051 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 08:48:08.835583 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832054 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 08:48:08.835583 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832056 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 08:48:08.835583 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832058 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 08:48:08.835583 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832061 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 08:48:08.835583 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832063 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 08:48:08.835583 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832066 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 08:48:08.835583 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832069 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 08:48:08.835583 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832071 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 08:48:08.836072 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832074 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 08:48:08.836072 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832076 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 
08:48:08.836072 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832079 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 08:48:08.836072 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832081 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 08:48:08.836072 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832083 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 08:48:08.836072 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832086 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 08:48:08.836072 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832089 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 08:48:08.836072 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832091 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 08:48:08.836072 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832094 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 08:48:08.836072 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832096 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 08:48:08.836072 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832099 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 08:48:08.836072 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832101 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 08:48:08.836072 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:08.832104 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 08:48:08.836072 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.832108 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false 
NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 08:48:08.836072 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.832217 2574 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 08:48:08.836448 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.835524 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 08:48:08.836685 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.836673 2574 server.go:1019] "Starting client certificate rotation" Apr 23 08:48:08.836791 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.836777 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 08:48:08.836822 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.836814 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 08:48:08.864138 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.864119 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 08:48:08.868781 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.868759 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 08:48:08.884064 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.884042 2574 log.go:25] "Validated CRI v1 runtime API" Apr 23 08:48:08.889781 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.889766 2574 log.go:25] "Validated CRI v1 image API" Apr 23 08:48:08.891041 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.891025 2574 reflector.go:430] "Caches populated" 
logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 08:48:08.891940 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.891925 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 23 08:48:08.896253 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.896230 2574 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 c6e005db-07f7-4dd8-8b87-63bb8a495aa6:/dev/nvme0n1p4 eff12348-eb32-4e53-af47-73f3bc9d93d0:/dev/nvme0n1p3] Apr 23 08:48:08.896320 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.896255 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 23 08:48:08.902937 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.902831 2574 manager.go:217] Machine: {Timestamp:2026-04-23 08:48:08.900893613 +0000 UTC m=+0.417752942 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099766 MemoryCapacity:32812163072 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec21d1ff6dd1088608d18bee2508b7ce SystemUUID:ec21d1ff-6dd1-0886-08d1-8bee2508b7ce BootID:62d2fe97-1337-43ce-a880-cba9484c4120 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} 
{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406081536 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:13:08:e2:82:7d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:13:08:e2:82:7d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:c2:c7:fd:ed:6e:d8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812163072 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 23 08:48:08.903377 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.903365 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 23 08:48:08.903479 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.903466 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 08:48:08.904654 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.904632 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 08:48:08.904790 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.904656 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-48.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 08:48:08.904839 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.904799 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 08:48:08.904839 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.904809 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 08:48:08.904839 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.904822 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 08:48:08.905634 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.905624 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 08:48:08.906476 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.906467 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 08:48:08.906575 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.906566 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 08:48:08.909243 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.909234 2574 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 08:48:08.909280 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.909247 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 08:48:08.909280 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.909258 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 08:48:08.909280 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.909267 2574 kubelet.go:397] "Adding apiserver pod source"
Apr 23 08:48:08.909280 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.909276 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 08:48:08.910485 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.910473 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 08:48:08.910534 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.910492 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 08:48:08.912758 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.912740 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-g9dbj"
Apr 23 08:48:08.913473 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.913452 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 08:48:08.914874 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.914860 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 08:48:08.916762 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.916750 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 08:48:08.916799 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.916767 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 08:48:08.916799 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.916773 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 08:48:08.916799 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.916784 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 08:48:08.916799 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.916790 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 08:48:08.916799 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.916796 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 08:48:08.916932 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.916801 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 08:48:08.916932 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.916807 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 08:48:08.916932 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.916815 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 08:48:08.916932 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.916820 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 08:48:08.916932 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.916828 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 08:48:08.916932 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.916837 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 08:48:08.917680 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.917669 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 08:48:08.917710 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.917680 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 08:48:08.920590 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.920561 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-g9dbj"
Apr 23 08:48:08.920590 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:08.920564 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-48.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 08:48:08.920724 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:08.920591 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 08:48:08.921504 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.921492 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 08:48:08.921536 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.921531 2574 server.go:1295] "Started kubelet"
Apr 23 08:48:08.921673 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.921619 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 08:48:08.921752 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.921627 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 08:48:08.921752 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.921732 2574 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 08:48:08.922450 ip-10-0-139-48 systemd[1]: Started Kubernetes Kubelet.
Apr 23 08:48:08.923588 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.923575 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 08:48:08.923775 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.923763 2574 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 08:48:08.928510 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.928480 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 08:48:08.929133 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.929114 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 08:48:08.930107 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.930089 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 08:48:08.931708 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.930099 2574 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 08:48:08.931794 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.931712 2574 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 08:48:08.931876 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.931855 2574 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 08:48:08.931876 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.931863 2574 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 08:48:08.932317 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.932298 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:48:08.932536 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:08.932515 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-48.ec2.internal\" not found"
Apr 23 08:48:08.932612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.932585 2574 factory.go:153] Registering CRI-O factory
Apr 23 08:48:08.932612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.932598 2574 factory.go:223] Registration of the crio container factory successfully
Apr 23 08:48:08.932692 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.932676 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 08:48:08.932692 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.932686 2574 factory.go:55] Registering systemd factory
Apr 23 08:48:08.932776 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.932693 2574 factory.go:223] Registration of the systemd container factory successfully
Apr 23 08:48:08.932776 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.932723 2574 factory.go:103] Registering Raw factory
Apr 23 08:48:08.932776 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.932736 2574 manager.go:1196] Started watching for new ooms in manager
Apr 23 08:48:08.933030 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:08.932982 2574 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 08:48:08.933732 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.933715 2574 manager.go:319] Starting recovery of all containers
Apr 23 08:48:08.934740 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:08.934720 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-48.ec2.internal\" not found" node="ip-10-0-139-48.ec2.internal"
Apr 23 08:48:08.935062 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.935047 2574 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-48.ec2.internal" not found
Apr 23 08:48:08.939472 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.939144 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 08:48:08.945701 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.945688 2574 manager.go:324] Recovery completed
Apr 23 08:48:08.949465 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.949450 2574 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-48.ec2.internal" not found
Apr 23 08:48:08.949519 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.949508 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:48:08.952840 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.952827 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-48.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:48:08.952893 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.952856 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-48.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:48:08.952893 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.952869 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-48.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:48:08.953361 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.953348 2574 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 08:48:08.953426 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.953361 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 08:48:08.953426 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.953403 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 08:48:08.955914 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.955897 2574 policy_none.go:49] "None policy: Start"
Apr 23 08:48:08.955914 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.955913 2574 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 08:48:08.956049 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.955923 2574 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 08:48:08.992926 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.992910 2574 manager.go:341] "Starting Device Plugin manager"
Apr 23 08:48:09.004779 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:08.992965 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 08:48:09.004779 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.992975 2574 server.go:85] "Starting device plugin registration server"
Apr 23 08:48:09.004779 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.993191 2574 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 08:48:09.004779 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.993204 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 08:48:09.004779 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.993273 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 08:48:09.004779 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.993351 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 08:48:09.004779 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:08.993359 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 08:48:09.004779 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:08.993987 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 08:48:09.004779 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:08.994029 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-48.ec2.internal\" not found"
Apr 23 08:48:09.004779 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.004239 2574 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-48.ec2.internal" not found
Apr 23 08:48:09.065029 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.065002 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 08:48:09.065133 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.065036 2574 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 08:48:09.065133 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.065055 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 08:48:09.065133 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.065062 2574 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 08:48:09.065293 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:09.065133 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 08:48:09.068582 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.068565 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:48:09.094065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.094036 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:48:09.095462 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.095447 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-48.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:48:09.095542 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.095474 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-48.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:48:09.095542 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.095485 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-48.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:48:09.095542 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.095509 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-48.ec2.internal"
Apr 23 08:48:09.104012 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.103994 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-48.ec2.internal"
Apr 23 08:48:09.104064 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:09.104019 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-48.ec2.internal\": node \"ip-10-0-139-48.ec2.internal\" not found"
Apr 23 08:48:09.121537 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:09.121514 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-48.ec2.internal\" not found"
Apr 23 08:48:09.166057 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.166032 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-48.ec2.internal"]
Apr 23 08:48:09.166127 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.166097 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:48:09.166917 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.166901 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-48.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:48:09.166993 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.166935 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-48.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:48:09.166993 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.166951 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-48.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:48:09.168174 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.168158 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:48:09.168330 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.168317 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal"
Apr 23 08:48:09.168373 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.168346 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:48:09.169179 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.169160 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-48.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:48:09.169257 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.169166 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-48.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:48:09.169257 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.169209 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-48.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:48:09.169257 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.169219 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-48.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:48:09.169257 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.169194 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-48.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:48:09.169401 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.169275 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-48.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:48:09.170502 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.170486 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-48.ec2.internal"
Apr 23 08:48:09.170565 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.170510 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:48:09.171146 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.171130 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-48.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:48:09.171221 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.171153 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-48.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:48:09.171221 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.171166 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-48.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:48:09.205074 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:09.205055 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-48.ec2.internal\" not found" node="ip-10-0-139-48.ec2.internal"
Apr 23 08:48:09.209608 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:09.209593 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-48.ec2.internal\" not found" node="ip-10-0-139-48.ec2.internal"
Apr 23 08:48:09.222593 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:09.222575 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-48.ec2.internal\" not found"
Apr 23 08:48:09.232993 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.232975 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e0f0fea8f3c0603deedbbe0331afef6c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal\" (UID: \"e0f0fea8f3c0603deedbbe0331afef6c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal"
Apr 23 08:48:09.233053 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.233000 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0f0fea8f3c0603deedbbe0331afef6c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal\" (UID: \"e0f0fea8f3c0603deedbbe0331afef6c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal"
Apr 23 08:48:09.233053 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.233017 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/20c73c33a3b7b77bd6beaa98c127d630-config\") pod \"kube-apiserver-proxy-ip-10-0-139-48.ec2.internal\" (UID: \"20c73c33a3b7b77bd6beaa98c127d630\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-48.ec2.internal"
Apr 23 08:48:09.323258 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:09.323210 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-48.ec2.internal\" not found"
Apr 23 08:48:09.333552 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.333532 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e0f0fea8f3c0603deedbbe0331afef6c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal\" (UID: \"e0f0fea8f3c0603deedbbe0331afef6c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal"
Apr 23 08:48:09.333620 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.333575 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e0f0fea8f3c0603deedbbe0331afef6c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal\" (UID: \"e0f0fea8f3c0603deedbbe0331afef6c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal"
Apr 23 08:48:09.333664 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.333622 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0f0fea8f3c0603deedbbe0331afef6c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal\" (UID: \"e0f0fea8f3c0603deedbbe0331afef6c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal"
Apr 23 08:48:09.333664 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.333642 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/20c73c33a3b7b77bd6beaa98c127d630-config\") pod \"kube-apiserver-proxy-ip-10-0-139-48.ec2.internal\" (UID: \"20c73c33a3b7b77bd6beaa98c127d630\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-48.ec2.internal"
Apr 23 08:48:09.333735 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.333668 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/20c73c33a3b7b77bd6beaa98c127d630-config\") pod \"kube-apiserver-proxy-ip-10-0-139-48.ec2.internal\" (UID: \"20c73c33a3b7b77bd6beaa98c127d630\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-48.ec2.internal"
Apr 23 08:48:09.333735 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.333674 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0f0fea8f3c0603deedbbe0331afef6c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal\" (UID: \"e0f0fea8f3c0603deedbbe0331afef6c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal"
Apr 23 08:48:09.423983 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:09.423949 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-48.ec2.internal\" not found"
Apr 23 08:48:09.507549 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.507513 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal"
Apr 23 08:48:09.512054 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.512029 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-48.ec2.internal"
Apr 23 08:48:09.524730 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:09.524713 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-48.ec2.internal\" not found"
Apr 23 08:48:09.625324 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:09.625259 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-48.ec2.internal\" not found"
Apr 23 08:48:09.725776 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:09.725749 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-48.ec2.internal\" not found"
Apr 23 08:48:09.826338 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:09.826305 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-48.ec2.internal\" not found"
Apr 23 08:48:09.836638 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.836612 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 08:48:09.836795 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.836777 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 08:48:09.836843 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.836794 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 08:48:09.908130 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.908059 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:48:09.922406 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.922357 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 08:43:08 +0000 UTC" deadline="2028-02-09 02:08:22.971145176 +0000 UTC"
Apr 23 08:48:09.922406 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.922403 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15761h20m13.048745403s"
Apr 23 08:48:09.926511 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:09.926489 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-48.ec2.internal\" not found"
Apr 23 08:48:09.928591 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.928566 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 08:48:09.941889 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.941866 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 08:48:09.961812 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.961789 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-w9tb7"
Apr 23 08:48:09.969881 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:09.969858 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-w9tb7"
Apr 23 08:48:10.025962 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:10.025934 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0f0fea8f3c0603deedbbe0331afef6c.slice/crio-7a7aa8c9ef4343273782dedf9c1111cf105bd25c42ee707d9fab4cd67e3ad9ad WatchSource:0}: Error finding container 7a7aa8c9ef4343273782dedf9c1111cf105bd25c42ee707d9fab4cd67e3ad9ad: Status 404 returned error can't find the container with id 7a7aa8c9ef4343273782dedf9c1111cf105bd25c42ee707d9fab4cd67e3ad9ad
Apr 23 08:48:10.026222 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:10.026201 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20c73c33a3b7b77bd6beaa98c127d630.slice/crio-98b50955d106413d7a0baa4e696de22f97e832e11a17513029a0b0b43f0dfd65 WatchSource:0}: Error finding container 98b50955d106413d7a0baa4e696de22f97e832e11a17513029a0b0b43f0dfd65: Status 404 returned error can't find the container with id 98b50955d106413d7a0baa4e696de22f97e832e11a17513029a0b0b43f0dfd65
Apr 23 08:48:10.027317 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:10.027303 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-48.ec2.internal\" not found"
Apr 23 08:48:10.029764 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.029749 2574 provider.go:93] Refreshing
cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 08:48:10.068314 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.068263 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal" event={"ID":"e0f0fea8f3c0603deedbbe0331afef6c","Type":"ContainerStarted","Data":"7a7aa8c9ef4343273782dedf9c1111cf105bd25c42ee707d9fab4cd67e3ad9ad"} Apr 23 08:48:10.069155 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.069131 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-48.ec2.internal" event={"ID":"20c73c33a3b7b77bd6beaa98c127d630","Type":"ContainerStarted","Data":"98b50955d106413d7a0baa4e696de22f97e832e11a17513029a0b0b43f0dfd65"} Apr 23 08:48:10.113003 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.112985 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:48:10.130285 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.130265 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal" Apr 23 08:48:10.142461 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.142444 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 08:48:10.143528 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.143515 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-48.ec2.internal" Apr 23 08:48:10.151991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.151978 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 08:48:10.747442 ip-10-0-139-48 kubenswrapper[2574]: 
I0423 08:48:10.747414 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:48:10.910407 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.910357 2574 apiserver.go:52] "Watching apiserver" Apr 23 08:48:10.921502 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.921462 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 08:48:10.922705 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.922677 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-td2mf","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2","openshift-dns/node-resolver-ffdbq","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal","openshift-multus/multus-5zvll","openshift-multus/network-metrics-daemon-wnmff","openshift-network-diagnostics/network-check-target-xksm7","openshift-network-operator/iptables-alerter-hv5wx","kube-system/kube-apiserver-proxy-ip-10-0-139-48.ec2.internal","openshift-cluster-node-tuning-operator/tuned-pvr56","openshift-image-registry/node-ca-px7gw","openshift-multus/multus-additional-cni-plugins-27dts","openshift-ovn-kubernetes/ovnkube-node-k9zxl"] Apr 23 08:48:10.926067 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.926046 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-pvr56" Apr 23 08:48:10.927408 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.927364 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ffdbq" Apr 23 08:48:10.929050 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.928836 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2" Apr 23 08:48:10.929139 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.928931 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5zvll" Apr 23 08:48:10.929515 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.929491 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jc9lv\"" Apr 23 08:48:10.929623 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.929573 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 08:48:10.929623 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.929573 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:48:10.929931 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.929915 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 08:48:10.930360 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.930339 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 08:48:10.930467 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.930380 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-tnkm9\"" Apr 23 08:48:10.931605 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.931551 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 08:48:10.931605 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.931557 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 08:48:10.931764 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.931619 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:10.931764 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:10.931677 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wnmff" podUID="b87e8c66-90ff-454c-9c82-3fe28797e8df" Apr 23 08:48:10.932851 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.932832 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:10.932940 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:10.932918 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xksm7" podUID="96a803a2-c0b4-4975-b7d2-ca3157aa9f00" Apr 23 08:48:10.933006 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.932927 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-hv5wx" Apr 23 08:48:10.933167 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.933150 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 08:48:10.933249 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.933234 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 08:48:10.933307 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.933254 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 08:48:10.933307 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.933266 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 08:48:10.933421 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.933360 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-j9zjj\"" Apr 23 08:48:10.933421 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.933363 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-77bxt\"" Apr 23 08:48:10.933421 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.933410 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 08:48:10.934058 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.934003 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-td2mf" Apr 23 08:48:10.935532 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.935496 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 08:48:10.936346 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.936012 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 08:48:10.936462 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.936354 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:48:10.936535 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.936516 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-px7gw" Apr 23 08:48:10.936908 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.936881 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-5hdm4\"" Apr 23 08:48:10.937839 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.937817 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 08:48:10.937992 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.937974 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-n2m64\"" Apr 23 08:48:10.938078 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.938061 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 08:48:10.939802 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.939014 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 
08:48:10.939891 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.939817 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5lxjk\"" Apr 23 08:48:10.939988 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.939973 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 08:48:10.940543 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.940524 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 08:48:10.940887 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.940871 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-27dts" Apr 23 08:48:10.941206 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941187 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-sys\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56" Apr 23 08:48:10.941281 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941211 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fcf385b3-bad6-4ba8-ad14-e86462762f6a-device-dir\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2" Apr 23 08:48:10.941281 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941229 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-system-cni-dir\") pod 
\"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:10.941281 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941245 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-host-var-lib-cni-bin\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:10.941281 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941269 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-host-var-lib-kubelet\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:10.941499 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941327 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-hostroot\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:10.941499 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941353 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-host-run-multus-certs\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:10.941499 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941369 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6s5k\" (UniqueName: 
\"kubernetes.io/projected/e556abe8-644b-4251-99a6-a109bfc8c173-kube-api-access-r6s5k\") pod \"node-resolver-ffdbq\" (UID: \"e556abe8-644b-4251-99a6-a109bfc8c173\") " pod="openshift-dns/node-resolver-ffdbq" Apr 23 08:48:10.941499 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941411 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-run\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56" Apr 23 08:48:10.941499 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941430 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7vpx\" (UniqueName: \"kubernetes.io/projected/e45e878a-2315-478d-a452-625c243a29cc-kube-api-access-n7vpx\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56" Apr 23 08:48:10.941499 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941445 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdv4w\" (UniqueName: \"kubernetes.io/projected/fcf385b3-bad6-4ba8-ad14-e86462762f6a-kube-api-access-kdv4w\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2" Apr 23 08:48:10.941499 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941460 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-multus-daemon-config\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:10.941499 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941474 
2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e556abe8-644b-4251-99a6-a109bfc8c173-tmp-dir\") pod \"node-resolver-ffdbq\" (UID: \"e556abe8-644b-4251-99a6-a109bfc8c173\") " pod="openshift-dns/node-resolver-ffdbq" Apr 23 08:48:10.941499 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941487 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e45e878a-2315-478d-a452-625c243a29cc-tmp\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56" Apr 23 08:48:10.941893 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941529 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fcf385b3-bad6-4ba8-ad14-e86462762f6a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2" Apr 23 08:48:10.941893 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941563 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fcf385b3-bad6-4ba8-ad14-e86462762f6a-registration-dir\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2" Apr 23 08:48:10.941893 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941592 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e556abe8-644b-4251-99a6-a109bfc8c173-hosts-file\") pod \"node-resolver-ffdbq\" (UID: \"e556abe8-644b-4251-99a6-a109bfc8c173\") " 
pod="openshift-dns/node-resolver-ffdbq" Apr 23 08:48:10.941893 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941619 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-etc-sysconfig\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56" Apr 23 08:48:10.941893 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941641 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-lib-modules\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56" Apr 23 08:48:10.941893 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941662 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-var-lib-kubelet\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56" Apr 23 08:48:10.941893 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941684 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-multus-socket-dir-parent\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:10.941893 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941716 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/fcf385b3-bad6-4ba8-ad14-e86462762f6a-etc-selinux\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2" Apr 23 08:48:10.941893 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941737 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-cnibin\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:10.941893 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941754 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-cni-binary-copy\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:10.941893 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941777 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-etc-kubernetes\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:10.941893 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941791 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj26n\" (UniqueName: \"kubernetes.io/projected/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-kube-api-access-gj26n\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:10.941893 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941810 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs\") pod \"network-metrics-daemon-wnmff\" (UID: \"b87e8c66-90ff-454c-9c82-3fe28797e8df\") " pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:10.941893 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941842 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-etc-kubernetes\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56" Apr 23 08:48:10.942568 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941900 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-host\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56" Apr 23 08:48:10.942568 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941928 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e45e878a-2315-478d-a452-625c243a29cc-etc-tuned\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56" Apr 23 08:48:10.942568 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.941950 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fcf385b3-bad6-4ba8-ad14-e86462762f6a-socket-dir\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2" Apr 23 08:48:10.942568 ip-10-0-139-48 kubenswrapper[2574]: 
I0423 08:48:10.941978 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fcf385b3-bad6-4ba8-ad14-e86462762f6a-sys-fs\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2" Apr 23 08:48:10.942568 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.942047 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-os-release\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:10.942568 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.942076 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrxrb\" (UniqueName: \"kubernetes.io/projected/b87e8c66-90ff-454c-9c82-3fe28797e8df-kube-api-access-zrxrb\") pod \"network-metrics-daemon-wnmff\" (UID: \"b87e8c66-90ff-454c-9c82-3fe28797e8df\") " pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:10.942568 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.942106 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-etc-modprobe-d\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56" Apr 23 08:48:10.942568 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.942128 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-etc-sysctl-conf\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " 
pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:10.942568 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.942152 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-etc-systemd\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:10.942568 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.942175 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-multus-cni-dir\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:10.942568 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.942199 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-host-run-k8s-cni-cncf-io\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:10.942568 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.942224 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-host-var-lib-cni-multus\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:10.942568 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.942248 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-multus-conf-dir\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:10.942568 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.942274 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-etc-sysctl-d\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:10.942568 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.942324 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-host-run-netns\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:10.943282 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.943163 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 08:48:10.943381 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.943365 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 08:48:10.943512 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.943493 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:10.943621 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.943605 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-fqh4q\""
Apr 23 08:48:10.946006 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.945987 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 08:48:10.946095 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.945987 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 08:48:10.947285 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.947269 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 08:48:10.947418 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.947296 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 08:48:10.947418 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.947300 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 08:48:10.947418 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.947353 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 08:48:10.947658 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.947613 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-b8zhm\""
Apr 23 08:48:10.971453 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.971426 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:43:09 +0000 UTC" deadline="2027-10-28 06:46:45.555407877 +0000 UTC"
Apr 23 08:48:10.971542 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:10.971454 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13269h58m34.583957853s"
Apr 23 08:48:11.031156 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.031084 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 08:48:11.042979 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.042956 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e45e878a-2315-478d-a452-625c243a29cc-etc-tuned\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.043116 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.042984 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-os-release\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.043116 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043004 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrxrb\" (UniqueName: \"kubernetes.io/projected/b87e8c66-90ff-454c-9c82-3fe28797e8df-kube-api-access-zrxrb\") pod \"network-metrics-daemon-wnmff\" (UID: \"b87e8c66-90ff-454c-9c82-3fe28797e8df\") " pod="openshift-multus/network-metrics-daemon-wnmff"
Apr 23 08:48:11.043116 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043033 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-ovn-node-metrics-cert\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.043116 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043059 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tbjs\" (UniqueName: \"kubernetes.io/projected/cf591c15-c055-4b83-8692-b39b3dd9ece5-kube-api-access-6tbjs\") pod \"iptables-alerter-hv5wx\" (UID: \"cf591c15-c055-4b83-8692-b39b3dd9ece5\") " pod="openshift-network-operator/iptables-alerter-hv5wx"
Apr 23 08:48:11.043116 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043083 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-etc-sysctl-conf\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.043367 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043123 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-os-release\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.043367 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043137 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-multus-cni-dir\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.043367 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043188 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-host-var-lib-cni-multus\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.043367 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043208 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-etc-sysctl-conf\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.043367 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043207 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-multus-cni-dir\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.043367 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043224 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a619dfec-0cfa-46a8-9c86-7b06735feeaa-cni-binary-copy\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts"
Apr 23 08:48:11.043367 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043243 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-host-var-lib-cni-multus\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.043367 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043258 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-var-lib-openvswitch\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.043367 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043288 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-run-openvswitch\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.043367 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043306 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a619dfec-0cfa-46a8-9c86-7b06735feeaa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts"
Apr 23 08:48:11.043367 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043327 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-cni-netd\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.043367 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043341 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a619dfec-0cfa-46a8-9c86-7b06735feeaa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts"
Apr 23 08:48:11.043367 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043366 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-system-cni-dir\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.043925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043410 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-host-var-lib-cni-bin\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.043925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043408 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 23 08:48:11.043925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043438 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-host-var-lib-kubelet\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.043925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043440 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-system-cni-dir\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.043925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043455 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-hostroot\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.043925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043491 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-host-run-multus-certs\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.043925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043491 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-host-var-lib-cni-bin\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.043925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043508 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0-serviceca\") pod \"node-ca-px7gw\" (UID: \"3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0\") " pod="openshift-image-registry/node-ca-px7gw"
Apr 23 08:48:11.043925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043514 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-host-var-lib-kubelet\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.043925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043525 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6s5k\" (UniqueName: \"kubernetes.io/projected/e556abe8-644b-4251-99a6-a109bfc8c173-kube-api-access-r6s5k\") pod \"node-resolver-ffdbq\" (UID: \"e556abe8-644b-4251-99a6-a109bfc8c173\") " pod="openshift-dns/node-resolver-ffdbq"
Apr 23 08:48:11.043925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043538 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-hostroot\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.043925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043555 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdv4w\" (UniqueName: \"kubernetes.io/projected/fcf385b3-bad6-4ba8-ad14-e86462762f6a-kube-api-access-kdv4w\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2"
Apr 23 08:48:11.043925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043590 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-host-run-multus-certs\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.043925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043633 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-multus-daemon-config\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.043925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043745 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxwtc\" (UniqueName: \"kubernetes.io/projected/3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0-kube-api-access-pxwtc\") pod \"node-ca-px7gw\" (UID: \"3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0\") " pod="openshift-image-registry/node-ca-px7gw"
Apr 23 08:48:11.043925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043773 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a619dfec-0cfa-46a8-9c86-7b06735feeaa-os-release\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts"
Apr 23 08:48:11.043925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043805 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-kubelet\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.043925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043832 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-run-systemd\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.044778 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043880 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-log-socket\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.044778 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043915 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e556abe8-644b-4251-99a6-a109bfc8c173-tmp-dir\") pod \"node-resolver-ffdbq\" (UID: \"e556abe8-644b-4251-99a6-a109bfc8c173\") " pod="openshift-dns/node-resolver-ffdbq"
Apr 23 08:48:11.044778 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.043939 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e45e878a-2315-478d-a452-625c243a29cc-tmp\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.044778 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044201 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fcf385b3-bad6-4ba8-ad14-e86462762f6a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2"
Apr 23 08:48:11.044778 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044225 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pggqs\" (UniqueName: \"kubernetes.io/projected/a619dfec-0cfa-46a8-9c86-7b06735feeaa-kube-api-access-pggqs\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts"
Apr 23 08:48:11.044778 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044240 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e556abe8-644b-4251-99a6-a109bfc8c173-tmp-dir\") pod \"node-resolver-ffdbq\" (UID: \"e556abe8-644b-4251-99a6-a109bfc8c173\") " pod="openshift-dns/node-resolver-ffdbq"
Apr 23 08:48:11.044778 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044245 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-env-overrides\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.044778 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044299 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fcf385b3-bad6-4ba8-ad14-e86462762f6a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2"
Apr 23 08:48:11.044778 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044304 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cf591c15-c055-4b83-8692-b39b3dd9ece5-iptables-alerter-script\") pod \"iptables-alerter-hv5wx\" (UID: \"cf591c15-c055-4b83-8692-b39b3dd9ece5\") " pod="openshift-network-operator/iptables-alerter-hv5wx"
Apr 23 08:48:11.044778 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044337 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-etc-sysconfig\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.044778 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044356 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-cni-bin\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.044778 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044372 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-cnibin\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.044778 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044410 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-ovnkube-config\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.044778 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044437 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-ovnkube-script-lib\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.044778 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044443 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-etc-sysconfig\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.044778 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044445 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-cnibin\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.044778 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044464 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-etc-kubernetes\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.045569 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044493 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-host\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.045569 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044496 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-multus-daemon-config\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.045569 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044543 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-host\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.045569 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044515 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fcf385b3-bad6-4ba8-ad14-e86462762f6a-socket-dir\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2"
Apr 23 08:48:11.045569 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044506 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-etc-kubernetes\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.045569 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044594 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fcf385b3-bad6-4ba8-ad14-e86462762f6a-sys-fs\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2"
Apr 23 08:48:11.045569 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044599 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fcf385b3-bad6-4ba8-ad14-e86462762f6a-socket-dir\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2"
Apr 23 08:48:11.045569 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044638 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fcf385b3-bad6-4ba8-ad14-e86462762f6a-sys-fs\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2"
Apr 23 08:48:11.045569 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044628 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.045569 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044667 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-etc-modprobe-d\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.045569 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044682 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-etc-systemd\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.045569 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044697 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-host-run-k8s-cni-cncf-io\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.045569 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044727 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-multus-conf-dir\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.045569 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044741 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-etc-systemd\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.045569 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044756 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0-host\") pod \"node-ca-px7gw\" (UID: \"3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0\") " pod="openshift-image-registry/node-ca-px7gw"
Apr 23 08:48:11.045569 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044777 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-multus-conf-dir\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.045569 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044782 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a619dfec-0cfa-46a8-9c86-7b06735feeaa-cnibin\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts"
Apr 23 08:48:11.046334 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044807 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-run-ovn\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.046334 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044813 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-etc-modprobe-d\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.046334 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044828 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgbvp\" (UniqueName: \"kubernetes.io/projected/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-kube-api-access-tgbvp\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.046334 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044851 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-etc-sysctl-d\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.046334 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044854 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-host-run-k8s-cni-cncf-io\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.046334 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044879 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-host-run-netns\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.046334 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044901 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-sys\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.046334 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044915 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fcf385b3-bad6-4ba8-ad14-e86462762f6a-device-dir\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2"
Apr 23 08:48:11.046334 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044938 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-systemd-units\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.046334 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044958 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-etc-sysctl-d\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.046334 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044964 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-node-log\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.046334 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.044980 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-run\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.046334 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045019 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-sys\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.046334 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045023 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7vpx\" (UniqueName: \"kubernetes.io/projected/e45e878a-2315-478d-a452-625c243a29cc-kube-api-access-n7vpx\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.046334 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045043 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-run\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.046334 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045052 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-slash\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.046334 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045070 2574
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-host-run-netns\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:11.047065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045093 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-run-netns\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" Apr 23 08:48:11.047065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045110 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fcf385b3-bad6-4ba8-ad14-e86462762f6a-registration-dir\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2" Apr 23 08:48:11.047065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045125 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e556abe8-644b-4251-99a6-a109bfc8c173-hosts-file\") pod \"node-resolver-ffdbq\" (UID: \"e556abe8-644b-4251-99a6-a109bfc8c173\") " pod="openshift-dns/node-resolver-ffdbq" Apr 23 08:48:11.047065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045165 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e556abe8-644b-4251-99a6-a109bfc8c173-hosts-file\") pod \"node-resolver-ffdbq\" (UID: \"e556abe8-644b-4251-99a6-a109bfc8c173\") " pod="openshift-dns/node-resolver-ffdbq" Apr 23 08:48:11.047065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045174 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fcf385b3-bad6-4ba8-ad14-e86462762f6a-registration-dir\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2" Apr 23 08:48:11.047065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045188 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fcf385b3-bad6-4ba8-ad14-e86462762f6a-device-dir\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2" Apr 23 08:48:11.047065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045204 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a619dfec-0cfa-46a8-9c86-7b06735feeaa-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts" Apr 23 08:48:11.047065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045304 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-etc-openvswitch\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" Apr 23 08:48:11.047065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045341 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhfwl\" (UniqueName: \"kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl\") pod \"network-check-target-xksm7\" 
(UID: \"96a803a2-c0b4-4975-b7d2-ca3157aa9f00\") " pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:11.047065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045372 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-lib-modules\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56" Apr 23 08:48:11.047065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045430 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-var-lib-kubelet\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56" Apr 23 08:48:11.047065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045454 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-var-lib-kubelet\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56" Apr 23 08:48:11.047065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045479 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-multus-socket-dir-parent\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:11.047065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045495 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-run-ovn-kubernetes\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" Apr 23 08:48:11.047065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045552 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cf591c15-c055-4b83-8692-b39b3dd9ece5-host-slash\") pod \"iptables-alerter-hv5wx\" (UID: \"cf591c15-c055-4b83-8692-b39b3dd9ece5\") " pod="openshift-network-operator/iptables-alerter-hv5wx" Apr 23 08:48:11.047065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045561 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e45e878a-2315-478d-a452-625c243a29cc-lib-modules\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56" Apr 23 08:48:11.047065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045581 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fcf385b3-bad6-4ba8-ad14-e86462762f6a-etc-selinux\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2" Apr 23 08:48:11.047791 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045641 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-multus-socket-dir-parent\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:11.047791 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045645 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-cni-binary-copy\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:11.047791 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045675 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-etc-kubernetes\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:11.047791 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045707 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gj26n\" (UniqueName: \"kubernetes.io/projected/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-kube-api-access-gj26n\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:11.047791 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045729 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs\") pod \"network-metrics-daemon-wnmff\" (UID: \"b87e8c66-90ff-454c-9c82-3fe28797e8df\") " pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:11.047791 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.045754 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ee9e598b-755e-47f8-95fd-9e72a79972ac-agent-certs\") pod \"konnectivity-agent-td2mf\" (UID: \"ee9e598b-755e-47f8-95fd-9e72a79972ac\") " pod="kube-system/konnectivity-agent-td2mf" Apr 23 08:48:11.047791 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.046608 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ee9e598b-755e-47f8-95fd-9e72a79972ac-konnectivity-ca\") pod \"konnectivity-agent-td2mf\" (UID: \"ee9e598b-755e-47f8-95fd-9e72a79972ac\") " pod="kube-system/konnectivity-agent-td2mf" Apr 23 08:48:11.047791 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.046806 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a619dfec-0cfa-46a8-9c86-7b06735feeaa-system-cni-dir\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts" Apr 23 08:48:11.047791 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.046898 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-etc-kubernetes\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:11.047791 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.046944 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fcf385b3-bad6-4ba8-ad14-e86462762f6a-etc-selinux\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2" Apr 23 08:48:11.047791 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:11.047181 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:11.047791 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:11.047277 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs podName:b87e8c66-90ff-454c-9c82-3fe28797e8df nodeName:}" failed. 
No retries permitted until 2026-04-23 08:48:11.547228755 +0000 UTC m=+3.064088068 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs") pod "network-metrics-daemon-wnmff" (UID: "b87e8c66-90ff-454c-9c82-3fe28797e8df") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:11.047791 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.047677 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-cni-binary-copy\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:11.048350 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.047799 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e45e878a-2315-478d-a452-625c243a29cc-etc-tuned\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56" Apr 23 08:48:11.048350 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.047901 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e45e878a-2315-478d-a452-625c243a29cc-tmp\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56" Apr 23 08:48:11.053046 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.053021 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrxrb\" (UniqueName: \"kubernetes.io/projected/b87e8c66-90ff-454c-9c82-3fe28797e8df-kube-api-access-zrxrb\") pod \"network-metrics-daemon-wnmff\" (UID: \"b87e8c66-90ff-454c-9c82-3fe28797e8df\") " pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:11.053324 ip-10-0-139-48 
kubenswrapper[2574]: I0423 08:48:11.053307 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdv4w\" (UniqueName: \"kubernetes.io/projected/fcf385b3-bad6-4ba8-ad14-e86462762f6a-kube-api-access-kdv4w\") pod \"aws-ebs-csi-driver-node-7lpk2\" (UID: \"fcf385b3-bad6-4ba8-ad14-e86462762f6a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2" Apr 23 08:48:11.053606 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.053584 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6s5k\" (UniqueName: \"kubernetes.io/projected/e556abe8-644b-4251-99a6-a109bfc8c173-kube-api-access-r6s5k\") pod \"node-resolver-ffdbq\" (UID: \"e556abe8-644b-4251-99a6-a109bfc8c173\") " pod="openshift-dns/node-resolver-ffdbq" Apr 23 08:48:11.056195 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.056094 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7vpx\" (UniqueName: \"kubernetes.io/projected/e45e878a-2315-478d-a452-625c243a29cc-kube-api-access-n7vpx\") pod \"tuned-pvr56\" (UID: \"e45e878a-2315-478d-a452-625c243a29cc\") " pod="openshift-cluster-node-tuning-operator/tuned-pvr56" Apr 23 08:48:11.058145 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.058123 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj26n\" (UniqueName: \"kubernetes.io/projected/3e5f1db4-0744-40e3-98f8-2f123ac8d9ab-kube-api-access-gj26n\") pod \"multus-5zvll\" (UID: \"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab\") " pod="openshift-multus/multus-5zvll" Apr 23 08:48:11.147238 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147176 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-run-ovn-kubernetes\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" Apr 23 08:48:11.147238 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147220 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cf591c15-c055-4b83-8692-b39b3dd9ece5-host-slash\") pod \"iptables-alerter-hv5wx\" (UID: \"cf591c15-c055-4b83-8692-b39b3dd9ece5\") " pod="openshift-network-operator/iptables-alerter-hv5wx" Apr 23 08:48:11.148005 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147259 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ee9e598b-755e-47f8-95fd-9e72a79972ac-agent-certs\") pod \"konnectivity-agent-td2mf\" (UID: \"ee9e598b-755e-47f8-95fd-9e72a79972ac\") " pod="kube-system/konnectivity-agent-td2mf" Apr 23 08:48:11.148005 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147291 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ee9e598b-755e-47f8-95fd-9e72a79972ac-konnectivity-ca\") pod \"konnectivity-agent-td2mf\" (UID: \"ee9e598b-755e-47f8-95fd-9e72a79972ac\") " pod="kube-system/konnectivity-agent-td2mf" Apr 23 08:48:11.148005 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147305 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-run-ovn-kubernetes\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" Apr 23 08:48:11.148005 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147314 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a619dfec-0cfa-46a8-9c86-7b06735feeaa-system-cni-dir\") pod \"multus-additional-cni-plugins-27dts\" (UID: 
\"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts" Apr 23 08:48:11.148005 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147356 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-ovn-node-metrics-cert\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" Apr 23 08:48:11.148005 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147378 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tbjs\" (UniqueName: \"kubernetes.io/projected/cf591c15-c055-4b83-8692-b39b3dd9ece5-kube-api-access-6tbjs\") pod \"iptables-alerter-hv5wx\" (UID: \"cf591c15-c055-4b83-8692-b39b3dd9ece5\") " pod="openshift-network-operator/iptables-alerter-hv5wx" Apr 23 08:48:11.148005 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147416 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cf591c15-c055-4b83-8692-b39b3dd9ece5-host-slash\") pod \"iptables-alerter-hv5wx\" (UID: \"cf591c15-c055-4b83-8692-b39b3dd9ece5\") " pod="openshift-network-operator/iptables-alerter-hv5wx" Apr 23 08:48:11.148005 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147422 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a619dfec-0cfa-46a8-9c86-7b06735feeaa-cni-binary-copy\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts" Apr 23 08:48:11.148005 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147482 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-var-lib-openvswitch\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" Apr 23 08:48:11.148005 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147512 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-run-openvswitch\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" Apr 23 08:48:11.148005 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147536 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a619dfec-0cfa-46a8-9c86-7b06735feeaa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts" Apr 23 08:48:11.148005 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147563 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-cni-netd\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" Apr 23 08:48:11.148005 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147585 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a619dfec-0cfa-46a8-9c86-7b06735feeaa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts" Apr 23 08:48:11.148005 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147616 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0-serviceca\") pod \"node-ca-px7gw\" (UID: \"3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0\") " pod="openshift-image-registry/node-ca-px7gw" Apr 23 08:48:11.148005 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147640 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxwtc\" (UniqueName: \"kubernetes.io/projected/3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0-kube-api-access-pxwtc\") pod \"node-ca-px7gw\" (UID: \"3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0\") " pod="openshift-image-registry/node-ca-px7gw" Apr 23 08:48:11.148005 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147664 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a619dfec-0cfa-46a8-9c86-7b06735feeaa-os-release\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts" Apr 23 08:48:11.148005 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147689 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-kubelet\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" Apr 23 08:48:11.149042 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147714 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-run-systemd\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" Apr 23 08:48:11.149042 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147725 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-var-lib-openvswitch\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" Apr 23 08:48:11.149042 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147736 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-log-socket\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" Apr 23 08:48:11.149042 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147765 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pggqs\" (UniqueName: \"kubernetes.io/projected/a619dfec-0cfa-46a8-9c86-7b06735feeaa-kube-api-access-pggqs\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts" Apr 23 08:48:11.149042 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147789 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-env-overrides\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" Apr 23 08:48:11.149042 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147812 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cf591c15-c055-4b83-8692-b39b3dd9ece5-iptables-alerter-script\") pod \"iptables-alerter-hv5wx\" (UID: \"cf591c15-c055-4b83-8692-b39b3dd9ece5\") " pod="openshift-network-operator/iptables-alerter-hv5wx" Apr 23 
08:48:11.149042 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147839 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-cni-bin\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.149042 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147862 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-ovnkube-config\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.149042 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147886 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-ovnkube-script-lib\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.149042 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147920 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.149042 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147927 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a619dfec-0cfa-46a8-9c86-7b06735feeaa-cni-binary-copy\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts"
Apr 23 08:48:11.149042 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147934 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-cni-netd\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.149042 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147951 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0-host\") pod \"node-ca-px7gw\" (UID: \"3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0\") " pod="openshift-image-registry/node-ca-px7gw"
Apr 23 08:48:11.149042 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147969 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a619dfec-0cfa-46a8-9c86-7b06735feeaa-system-cni-dir\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts"
Apr 23 08:48:11.149042 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147975 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a619dfec-0cfa-46a8-9c86-7b06735feeaa-cnibin\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts"
Apr 23 08:48:11.149042 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147981 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a619dfec-0cfa-46a8-9c86-7b06735feeaa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts"
Apr 23 08:48:11.149042 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147993 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-run-openvswitch\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.149858 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147999 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-run-ovn\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.149858 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148034 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-kubelet\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.149858 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148074 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0-host\") pod \"node-ca-px7gw\" (UID: \"3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0\") " pod="openshift-image-registry/node-ca-px7gw"
Apr 23 08:48:11.149858 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148076 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a619dfec-0cfa-46a8-9c86-7b06735feeaa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts"
Apr 23 08:48:11.149858 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148080 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-run-systemd\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.149858 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.147927 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ee9e598b-755e-47f8-95fd-9e72a79972ac-konnectivity-ca\") pod \"konnectivity-agent-td2mf\" (UID: \"ee9e598b-755e-47f8-95fd-9e72a79972ac\") " pod="kube-system/konnectivity-agent-td2mf"
Apr 23 08:48:11.149858 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148120 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a619dfec-0cfa-46a8-9c86-7b06735feeaa-cnibin\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts"
Apr 23 08:48:11.149858 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148141 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.149858 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148153 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-log-socket\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.149858 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148190 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgbvp\" (UniqueName: \"kubernetes.io/projected/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-kube-api-access-tgbvp\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.149858 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148235 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-systemd-units\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.149858 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148259 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-node-log\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.149858 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148297 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-slash\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.149858 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148302 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-run-ovn\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.149858 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148321 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-run-netns\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.149858 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148350 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-cni-bin\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.149858 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148364 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-systemd-units\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.149858 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148382 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-node-log\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.150505 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148411 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cf591c15-c055-4b83-8692-b39b3dd9ece5-iptables-alerter-script\") pod \"iptables-alerter-hv5wx\" (UID: \"cf591c15-c055-4b83-8692-b39b3dd9ece5\") " pod="openshift-network-operator/iptables-alerter-hv5wx"
Apr 23 08:48:11.150505 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148469 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-run-netns\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.150505 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148506 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-host-slash\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.150505 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148549 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a619dfec-0cfa-46a8-9c86-7b06735feeaa-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts"
Apr 23 08:48:11.150505 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148573 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-env-overrides\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.150505 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148593 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-etc-openvswitch\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.150505 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148619 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhfwl\" (UniqueName: \"kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl\") pod \"network-check-target-xksm7\" (UID: \"96a803a2-c0b4-4975-b7d2-ca3157aa9f00\") " pod="openshift-network-diagnostics/network-check-target-xksm7"
Apr 23 08:48:11.150505 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148619 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-ovnkube-script-lib\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.150505 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148636 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a619dfec-0cfa-46a8-9c86-7b06735feeaa-os-release\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts"
Apr 23 08:48:11.150505 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148796 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-ovnkube-config\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.150505 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148828 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-etc-openvswitch\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.150505 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148961 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a619dfec-0cfa-46a8-9c86-7b06735feeaa-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts"
Apr 23 08:48:11.150505 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.148993 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0-serviceca\") pod \"node-ca-px7gw\" (UID: \"3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0\") " pod="openshift-image-registry/node-ca-px7gw"
Apr 23 08:48:11.151313 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.151296 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-ovn-node-metrics-cert\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.151902 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.151886 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ee9e598b-755e-47f8-95fd-9e72a79972ac-agent-certs\") pod \"konnectivity-agent-td2mf\" (UID: \"ee9e598b-755e-47f8-95fd-9e72a79972ac\") " pod="kube-system/konnectivity-agent-td2mf"
Apr 23 08:48:11.169567 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:11.169526 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:48:11.169567 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:11.169561 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:48:11.169567 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:11.169574 2574 projected.go:194] Error preparing data for projected volume kube-api-access-qhfwl for pod openshift-network-diagnostics/network-check-target-xksm7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:48:11.169851 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:11.169630 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl podName:96a803a2-c0b4-4975-b7d2-ca3157aa9f00 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:11.669616043 +0000 UTC m=+3.186475373 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-qhfwl" (UniqueName: "kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl") pod "network-check-target-xksm7" (UID: "96a803a2-c0b4-4975-b7d2-ca3157aa9f00") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:48:11.171878 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.171854 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pggqs\" (UniqueName: \"kubernetes.io/projected/a619dfec-0cfa-46a8-9c86-7b06735feeaa-kube-api-access-pggqs\") pod \"multus-additional-cni-plugins-27dts\" (UID: \"a619dfec-0cfa-46a8-9c86-7b06735feeaa\") " pod="openshift-multus/multus-additional-cni-plugins-27dts"
Apr 23 08:48:11.177105 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.177085 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgbvp\" (UniqueName: \"kubernetes.io/projected/7f9090b7-c998-4b54-9c85-0df5e37b4d9d-kube-api-access-tgbvp\") pod \"ovnkube-node-k9zxl\" (UID: \"7f9090b7-c998-4b54-9c85-0df5e37b4d9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.178296 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.178273 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tbjs\" (UniqueName: \"kubernetes.io/projected/cf591c15-c055-4b83-8692-b39b3dd9ece5-kube-api-access-6tbjs\") pod \"iptables-alerter-hv5wx\" (UID: \"cf591c15-c055-4b83-8692-b39b3dd9ece5\") " pod="openshift-network-operator/iptables-alerter-hv5wx"
Apr 23 08:48:11.178658 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.178640 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxwtc\" (UniqueName: \"kubernetes.io/projected/3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0-kube-api-access-pxwtc\") pod \"node-ca-px7gw\" (UID: \"3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0\") " pod="openshift-image-registry/node-ca-px7gw"
Apr 23 08:48:11.237804 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.237759 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-pvr56"
Apr 23 08:48:11.245562 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.245537 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ffdbq"
Apr 23 08:48:11.254229 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.254208 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2"
Apr 23 08:48:11.258856 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.258837 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5zvll"
Apr 23 08:48:11.266371 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.266350 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hv5wx"
Apr 23 08:48:11.271965 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.271946 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-td2mf"
Apr 23 08:48:11.280477 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.280455 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-px7gw"
Apr 23 08:48:11.286019 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.285970 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-27dts"
Apr 23 08:48:11.291592 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.291575 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl"
Apr 23 08:48:11.434465 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.434431 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:48:11.551769 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.551682 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs\") pod \"network-metrics-daemon-wnmff\" (UID: \"b87e8c66-90ff-454c-9c82-3fe28797e8df\") " pod="openshift-multus/network-metrics-daemon-wnmff"
Apr 23 08:48:11.551915 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:11.551828 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:48:11.551915 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:11.551897 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs podName:b87e8c66-90ff-454c-9c82-3fe28797e8df nodeName:}" failed. No retries permitted until 2026-04-23 08:48:12.551878211 +0000 UTC m=+4.068737531 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs") pod "network-metrics-daemon-wnmff" (UID: "b87e8c66-90ff-454c-9c82-3fe28797e8df") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:48:11.714665 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:11.714639 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcf385b3_bad6_4ba8_ad14_e86462762f6a.slice/crio-373c91524b02d54e0b858c46011227f608e17bd586e2dd76821450f1793c862b WatchSource:0}: Error finding container 373c91524b02d54e0b858c46011227f608e17bd586e2dd76821450f1793c862b: Status 404 returned error can't find the container with id 373c91524b02d54e0b858c46011227f608e17bd586e2dd76821450f1793c862b
Apr 23 08:48:11.714855 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:11.714835 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b1aa913_059c_4ed1_9c77_7a25bcbdb7f0.slice/crio-b2d7432ddd2c1c87e3a719e6cf7faf68dd0dd8d7c2a04f44580d45df8bb84176 WatchSource:0}: Error finding container b2d7432ddd2c1c87e3a719e6cf7faf68dd0dd8d7c2a04f44580d45df8bb84176: Status 404 returned error can't find the container with id b2d7432ddd2c1c87e3a719e6cf7faf68dd0dd8d7c2a04f44580d45df8bb84176
Apr 23 08:48:11.715588 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:11.715561 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf591c15_c055_4b83_8692_b39b3dd9ece5.slice/crio-137cc1c006d0f384961a55d04e23a231151a53e30efc6048e7401f9294706366 WatchSource:0}: Error finding container 137cc1c006d0f384961a55d04e23a231151a53e30efc6048e7401f9294706366: Status 404 returned error can't find the container with id 137cc1c006d0f384961a55d04e23a231151a53e30efc6048e7401f9294706366
Apr 23 08:48:11.722791 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:48:11.722475 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f9090b7_c998_4b54_9c85_0df5e37b4d9d.slice/crio-fa19631910058bfd0d209b33dbc35a12148139933a02ab8a7f7d6ac468c9f4bc WatchSource:0}: Error finding container fa19631910058bfd0d209b33dbc35a12148139933a02ab8a7f7d6ac468c9f4bc: Status 404 returned error can't find the container with id fa19631910058bfd0d209b33dbc35a12148139933a02ab8a7f7d6ac468c9f4bc
Apr 23 08:48:11.752878 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.752854 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhfwl\" (UniqueName: \"kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl\") pod \"network-check-target-xksm7\" (UID: \"96a803a2-c0b4-4975-b7d2-ca3157aa9f00\") " pod="openshift-network-diagnostics/network-check-target-xksm7"
Apr 23 08:48:11.755654 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:11.753303 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:48:11.755654 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:11.753338 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:48:11.755654 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:11.753365 2574 projected.go:194] Error preparing data for projected volume kube-api-access-qhfwl for pod openshift-network-diagnostics/network-check-target-xksm7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:48:11.755654 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:11.753479 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl podName:96a803a2-c0b4-4975-b7d2-ca3157aa9f00 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:12.7534451 +0000 UTC m=+4.270304424 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-qhfwl" (UniqueName: "kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl") pod "network-check-target-xksm7" (UID: "96a803a2-c0b4-4975-b7d2-ca3157aa9f00") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:48:11.971748 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.971705 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:43:09 +0000 UTC" deadline="2027-10-24 20:12:15.65969408 +0000 UTC"
Apr 23 08:48:11.971748 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:11.971744 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13187h24m3.687953798s"
Apr 23 08:48:12.076704 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:12.075970 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" event={"ID":"7f9090b7-c998-4b54-9c85-0df5e37b4d9d","Type":"ContainerStarted","Data":"fa19631910058bfd0d209b33dbc35a12148139933a02ab8a7f7d6ac468c9f4bc"}
Apr 23 08:48:12.080581 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:12.080545 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2" event={"ID":"fcf385b3-bad6-4ba8-ad14-e86462762f6a","Type":"ContainerStarted","Data":"373c91524b02d54e0b858c46011227f608e17bd586e2dd76821450f1793c862b"}
Apr 23 08:48:12.084511 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:12.084461 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-48.ec2.internal" event={"ID":"20c73c33a3b7b77bd6beaa98c127d630","Type":"ContainerStarted","Data":"d5dd72480883b907b3b5eff279abb5ed6f926d521d1a7b8cf2c400a173d9bb65"}
Apr 23 08:48:12.089041 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:12.089014 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ffdbq" event={"ID":"e556abe8-644b-4251-99a6-a109bfc8c173","Type":"ContainerStarted","Data":"b07adcd544af09f096211a5f21a701b362ef34a5fb50d1140fd5d89e8d35f8f6"}
Apr 23 08:48:12.091830 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:12.091804 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-pvr56" event={"ID":"e45e878a-2315-478d-a452-625c243a29cc","Type":"ContainerStarted","Data":"eabb56860cc0c92dafbd9a08b2df89eccc0822885eeab86e7141a7bd1334676c"}
Apr 23 08:48:12.099591 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:12.099432 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5zvll" event={"ID":"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab","Type":"ContainerStarted","Data":"a3e632bf1615a211d38daf90fa634322e69744da622facbc7076eed9964f55fb"}
Apr 23 08:48:12.100629 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:12.100334 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-48.ec2.internal" podStartSLOduration=2.10031972 podStartE2EDuration="2.10031972s" podCreationTimestamp="2026-04-23 08:48:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:48:12.099574246 +0000 UTC m=+3.616433582" watchObservedRunningTime="2026-04-23 08:48:12.10031972 +0000 UTC m=+3.617179056"
Apr 23 08:48:12.105793 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:12.105561 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27dts" event={"ID":"a619dfec-0cfa-46a8-9c86-7b06735feeaa","Type":"ContainerStarted","Data":"52a80a29f0a4070ede30df4a23f413959c2b2eb427abaf11d5449ba79819e389"}
Apr 23 08:48:12.116125 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:12.116092 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hv5wx" event={"ID":"cf591c15-c055-4b83-8692-b39b3dd9ece5","Type":"ContainerStarted","Data":"137cc1c006d0f384961a55d04e23a231151a53e30efc6048e7401f9294706366"}
Apr 23 08:48:12.131351 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:12.131215 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-px7gw" event={"ID":"3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0","Type":"ContainerStarted","Data":"b2d7432ddd2c1c87e3a719e6cf7faf68dd0dd8d7c2a04f44580d45df8bb84176"}
Apr 23 08:48:12.134679 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:12.134630 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-td2mf" event={"ID":"ee9e598b-755e-47f8-95fd-9e72a79972ac","Type":"ContainerStarted","Data":"56befb6eeeb68988910f8f9fa8ca23866d7663100b2576ef7713721b3e8c3a35"}
Apr 23 08:48:12.559990 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:12.559907 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs\") pod \"network-metrics-daemon-wnmff\" (UID: \"b87e8c66-90ff-454c-9c82-3fe28797e8df\") " pod="openshift-multus/network-metrics-daemon-wnmff"
Apr 23 08:48:12.560127 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:12.560059 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:48:12.560171 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:12.560134 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs podName:b87e8c66-90ff-454c-9c82-3fe28797e8df nodeName:}" failed. No retries permitted until 2026-04-23 08:48:14.560112031 +0000 UTC m=+6.076971344 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs") pod "network-metrics-daemon-wnmff" (UID: "b87e8c66-90ff-454c-9c82-3fe28797e8df") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:48:12.763696 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:12.763062 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhfwl\" (UniqueName: \"kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl\") pod \"network-check-target-xksm7\" (UID: \"96a803a2-c0b4-4975-b7d2-ca3157aa9f00\") " pod="openshift-network-diagnostics/network-check-target-xksm7"
Apr 23 08:48:12.763696 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:12.763238 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:48:12.763696 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:12.763257 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:48:12.763696 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:12.763280 2574 projected.go:194] Error preparing data for projected volume kube-api-access-qhfwl for pod openshift-network-diagnostics/network-check-target-xksm7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:48:12.763696 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:12.763336 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl podName:96a803a2-c0b4-4975-b7d2-ca3157aa9f00 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:14.763317933 +0000 UTC m=+6.280177252 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-qhfwl" (UniqueName: "kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl") pod "network-check-target-xksm7" (UID: "96a803a2-c0b4-4975-b7d2-ca3157aa9f00") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:48:13.068084 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:13.068053 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xksm7"
Apr 23 08:48:13.068478 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:13.068179 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xksm7" podUID="96a803a2-c0b4-4975-b7d2-ca3157aa9f00"
Apr 23 08:48:13.068629 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:13.068608 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wnmff"
Apr 23 08:48:13.068734 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:13.068714 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wnmff" podUID="b87e8c66-90ff-454c-9c82-3fe28797e8df"
Apr 23 08:48:13.149190 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:13.149099 2574 generic.go:358] "Generic (PLEG): container finished" podID="e0f0fea8f3c0603deedbbe0331afef6c" containerID="d615a92ab297617d73c33678b83921e46a95ba04acce44fc23441d3224ee1166" exitCode=0
Apr 23 08:48:13.150088 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:13.150031 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal" event={"ID":"e0f0fea8f3c0603deedbbe0331afef6c","Type":"ContainerDied","Data":"d615a92ab297617d73c33678b83921e46a95ba04acce44fc23441d3224ee1166"}
Apr 23 08:48:14.157297 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:14.157260 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal" event={"ID":"e0f0fea8f3c0603deedbbe0331afef6c","Type":"ContainerStarted","Data":"f729257e276eddc82a226ebdf782ce6f8c7084b8c5c6159e2ec4ee9fa16cb754"}
Apr 23 08:48:14.578721 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:14.578644 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs\") pod \"network-metrics-daemon-wnmff\" (UID: \"b87e8c66-90ff-454c-9c82-3fe28797e8df\") " pod="openshift-multus/network-metrics-daemon-wnmff"
Apr 23 08:48:14.578875 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:14.578796 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:48:14.578875 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:14.578860 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs
podName:b87e8c66-90ff-454c-9c82-3fe28797e8df nodeName:}" failed. No retries permitted until 2026-04-23 08:48:18.578839859 +0000 UTC m=+10.095699189 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs") pod "network-metrics-daemon-wnmff" (UID: "b87e8c66-90ff-454c-9c82-3fe28797e8df") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:14.780301 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:14.780261 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhfwl\" (UniqueName: \"kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl\") pod \"network-check-target-xksm7\" (UID: \"96a803a2-c0b4-4975-b7d2-ca3157aa9f00\") " pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:14.780504 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:14.780481 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:48:14.780572 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:14.780503 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:48:14.780572 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:14.780543 2574 projected.go:194] Error preparing data for projected volume kube-api-access-qhfwl for pod openshift-network-diagnostics/network-check-target-xksm7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:14.780678 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:14.780601 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl podName:96a803a2-c0b4-4975-b7d2-ca3157aa9f00 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:18.780582703 +0000 UTC m=+10.297442021 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-qhfwl" (UniqueName: "kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl") pod "network-check-target-xksm7" (UID: "96a803a2-c0b4-4975-b7d2-ca3157aa9f00") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:15.065337 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:15.065256 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:15.065514 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:15.065408 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xksm7" podUID="96a803a2-c0b4-4975-b7d2-ca3157aa9f00" Apr 23 08:48:15.065514 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:15.065418 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:15.065514 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:15.065500 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wnmff" podUID="b87e8c66-90ff-454c-9c82-3fe28797e8df" Apr 23 08:48:17.067788 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:17.067752 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:17.068192 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:17.067871 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xksm7" podUID="96a803a2-c0b4-4975-b7d2-ca3157aa9f00" Apr 23 08:48:17.068192 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:17.068182 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:17.068284 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:17.068251 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wnmff" podUID="b87e8c66-90ff-454c-9c82-3fe28797e8df" Apr 23 08:48:18.615893 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:18.615854 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs\") pod \"network-metrics-daemon-wnmff\" (UID: \"b87e8c66-90ff-454c-9c82-3fe28797e8df\") " pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:18.616357 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:18.616019 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:18.616357 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:18.616080 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs podName:b87e8c66-90ff-454c-9c82-3fe28797e8df nodeName:}" failed. No retries permitted until 2026-04-23 08:48:26.616061511 +0000 UTC m=+18.132920839 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs") pod "network-metrics-daemon-wnmff" (UID: "b87e8c66-90ff-454c-9c82-3fe28797e8df") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:18.816709 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:18.816670 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhfwl\" (UniqueName: \"kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl\") pod \"network-check-target-xksm7\" (UID: \"96a803a2-c0b4-4975-b7d2-ca3157aa9f00\") " pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:18.816873 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:18.816818 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:48:18.816873 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:18.816833 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:48:18.816873 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:18.816842 2574 projected.go:194] Error preparing data for projected volume kube-api-access-qhfwl for pod openshift-network-diagnostics/network-check-target-xksm7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:18.817047 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:18.816905 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl podName:96a803a2-c0b4-4975-b7d2-ca3157aa9f00 nodeName:}" failed. 
No retries permitted until 2026-04-23 08:48:26.816877397 +0000 UTC m=+18.333736725 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-qhfwl" (UniqueName: "kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl") pod "network-check-target-xksm7" (UID: "96a803a2-c0b4-4975-b7d2-ca3157aa9f00") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:19.066851 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:19.066762 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:19.067002 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:19.066890 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wnmff" podUID="b87e8c66-90ff-454c-9c82-3fe28797e8df" Apr 23 08:48:19.067267 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:19.067242 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:19.067356 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:19.067339 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xksm7" podUID="96a803a2-c0b4-4975-b7d2-ca3157aa9f00" Apr 23 08:48:21.065655 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:21.065615 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:21.066018 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:21.065760 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wnmff" podUID="b87e8c66-90ff-454c-9c82-3fe28797e8df" Apr 23 08:48:21.066018 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:21.065628 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:21.066018 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:21.065857 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xksm7" podUID="96a803a2-c0b4-4975-b7d2-ca3157aa9f00" Apr 23 08:48:23.066107 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:23.066071 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:23.066590 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:23.066201 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xksm7" podUID="96a803a2-c0b4-4975-b7d2-ca3157aa9f00" Apr 23 08:48:23.066590 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:23.066071 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:23.066701 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:23.066665 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wnmff" podUID="b87e8c66-90ff-454c-9c82-3fe28797e8df" Apr 23 08:48:25.065747 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:25.065550 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:25.066278 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:25.065552 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:25.066278 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:25.065847 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wnmff" podUID="b87e8c66-90ff-454c-9c82-3fe28797e8df" Apr 23 08:48:25.066278 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:25.065927 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xksm7" podUID="96a803a2-c0b4-4975-b7d2-ca3157aa9f00" Apr 23 08:48:26.676424 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:26.676359 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs\") pod \"network-metrics-daemon-wnmff\" (UID: \"b87e8c66-90ff-454c-9c82-3fe28797e8df\") " pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:26.676987 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:26.676518 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:26.676987 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:26.676589 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs podName:b87e8c66-90ff-454c-9c82-3fe28797e8df nodeName:}" failed. 
No retries permitted until 2026-04-23 08:48:42.676567912 +0000 UTC m=+34.193427227 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs") pod "network-metrics-daemon-wnmff" (UID: "b87e8c66-90ff-454c-9c82-3fe28797e8df") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:26.877851 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:26.877818 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhfwl\" (UniqueName: \"kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl\") pod \"network-check-target-xksm7\" (UID: \"96a803a2-c0b4-4975-b7d2-ca3157aa9f00\") " pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:26.878024 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:26.877991 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:48:26.878024 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:26.878012 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:48:26.878024 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:26.878022 2574 projected.go:194] Error preparing data for projected volume kube-api-access-qhfwl for pod openshift-network-diagnostics/network-check-target-xksm7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:26.878160 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:26.878073 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl 
podName:96a803a2-c0b4-4975-b7d2-ca3157aa9f00 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:42.87805859 +0000 UTC m=+34.394917904 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-qhfwl" (UniqueName: "kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl") pod "network-check-target-xksm7" (UID: "96a803a2-c0b4-4975-b7d2-ca3157aa9f00") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:27.066237 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:27.066157 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:27.066237 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:27.066211 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:27.066487 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:27.066285 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wnmff" podUID="b87e8c66-90ff-454c-9c82-3fe28797e8df" Apr 23 08:48:27.066487 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:27.066432 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xksm7" podUID="96a803a2-c0b4-4975-b7d2-ca3157aa9f00" Apr 23 08:48:29.066118 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:29.066097 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:29.066877 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:29.066201 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xksm7" podUID="96a803a2-c0b4-4975-b7d2-ca3157aa9f00" Apr 23 08:48:29.066877 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:29.066250 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:29.066877 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:29.066323 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wnmff" podUID="b87e8c66-90ff-454c-9c82-3fe28797e8df" Apr 23 08:48:29.185620 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:29.185376 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ffdbq" event={"ID":"e556abe8-644b-4251-99a6-a109bfc8c173","Type":"ContainerStarted","Data":"c985254636ff083a8ac963827af24a8dbfe3c61e7372af90164c2051834c3530"} Apr 23 08:48:29.189588 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:29.189559 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-pvr56" event={"ID":"e45e878a-2315-478d-a452-625c243a29cc","Type":"ContainerStarted","Data":"755487403e31d151a46e171ccad775c5a362e577b5fc5026bfeacbd4395126c5"} Apr 23 08:48:29.190939 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:29.190914 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5zvll" event={"ID":"3e5f1db4-0744-40e3-98f8-2f123ac8d9ab","Type":"ContainerStarted","Data":"92902c74e0a44e6e23220d990ea0244628c6a82d8151367442f911859bbb8bf1"} Apr 23 08:48:29.192273 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:29.192247 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27dts" event={"ID":"a619dfec-0cfa-46a8-9c86-7b06735feeaa","Type":"ContainerStarted","Data":"b73d3d8556a454e2a6f51ab135bd89dd5daec51d85a4716cd6c8098b8539ccf0"} Apr 23 08:48:29.193547 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:29.193527 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-px7gw" event={"ID":"3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0","Type":"ContainerStarted","Data":"c5b7c37c28792fda804d80c23591c6511450f473bf5af7dba9eb54482bbef7a3"} Apr 23 08:48:29.194649 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:29.194623 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-td2mf" 
event={"ID":"ee9e598b-755e-47f8-95fd-9e72a79972ac","Type":"ContainerStarted","Data":"da02970123836ad50745742cb9f35d563ad46a50072f15712ef2faef0740e188"} Apr 23 08:48:29.195756 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:29.195738 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" event={"ID":"7f9090b7-c998-4b54-9c85-0df5e37b4d9d","Type":"ContainerStarted","Data":"9b40d4a846dcf4ae9c9dfb87f6c075acb54213410bca85798e737910dfc58a17"} Apr 23 08:48:29.196818 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:29.196802 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2" event={"ID":"fcf385b3-bad6-4ba8-ad14-e86462762f6a","Type":"ContainerStarted","Data":"c143a1f6a57b59b86396dde187aae51a4e17314020ca5607716256f1a94faf12"} Apr 23 08:48:29.204270 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:29.204238 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ffdbq" podStartSLOduration=3.19519077 podStartE2EDuration="20.204228121s" podCreationTimestamp="2026-04-23 08:48:09 +0000 UTC" firstStartedPulling="2026-04-23 08:48:11.725877544 +0000 UTC m=+3.242736858" lastFinishedPulling="2026-04-23 08:48:28.734914894 +0000 UTC m=+20.251774209" observedRunningTime="2026-04-23 08:48:29.203926534 +0000 UTC m=+20.720785859" watchObservedRunningTime="2026-04-23 08:48:29.204228121 +0000 UTC m=+20.721087456" Apr 23 08:48:29.204353 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:29.204297 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-48.ec2.internal" podStartSLOduration=19.204293799 podStartE2EDuration="19.204293799s" podCreationTimestamp="2026-04-23 08:48:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:48:14.175306391 +0000 
UTC m=+5.692165726" watchObservedRunningTime="2026-04-23 08:48:29.204293799 +0000 UTC m=+20.721153142" Apr 23 08:48:29.219154 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:29.219121 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-pvr56" podStartSLOduration=3.19868845 podStartE2EDuration="20.219111565s" podCreationTimestamp="2026-04-23 08:48:09 +0000 UTC" firstStartedPulling="2026-04-23 08:48:11.724863375 +0000 UTC m=+3.241722688" lastFinishedPulling="2026-04-23 08:48:28.745286491 +0000 UTC m=+20.262145803" observedRunningTime="2026-04-23 08:48:29.219042937 +0000 UTC m=+20.735902271" watchObservedRunningTime="2026-04-23 08:48:29.219111565 +0000 UTC m=+20.735970900" Apr 23 08:48:29.234590 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:29.234551 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5zvll" podStartSLOduration=3.152844672 podStartE2EDuration="20.234542013s" podCreationTimestamp="2026-04-23 08:48:09 +0000 UTC" firstStartedPulling="2026-04-23 08:48:11.722734637 +0000 UTC m=+3.239593962" lastFinishedPulling="2026-04-23 08:48:28.804431973 +0000 UTC m=+20.321291303" observedRunningTime="2026-04-23 08:48:29.234333368 +0000 UTC m=+20.751192702" watchObservedRunningTime="2026-04-23 08:48:29.234542013 +0000 UTC m=+20.751401347" Apr 23 08:48:29.249212 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:29.249166 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-td2mf" podStartSLOduration=3.235639684 podStartE2EDuration="20.24915198s" podCreationTimestamp="2026-04-23 08:48:09 +0000 UTC" firstStartedPulling="2026-04-23 08:48:11.726694164 +0000 UTC m=+3.243553481" lastFinishedPulling="2026-04-23 08:48:28.740206452 +0000 UTC m=+20.257065777" observedRunningTime="2026-04-23 08:48:29.248231988 +0000 UTC m=+20.765091319" watchObservedRunningTime="2026-04-23 08:48:29.24915198 +0000 UTC 
m=+20.766011317" Apr 23 08:48:29.262030 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:29.261996 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-px7gw" podStartSLOduration=11.037890269 podStartE2EDuration="20.261987541s" podCreationTimestamp="2026-04-23 08:48:09 +0000 UTC" firstStartedPulling="2026-04-23 08:48:11.717497346 +0000 UTC m=+3.234356673" lastFinishedPulling="2026-04-23 08:48:20.941594628 +0000 UTC m=+12.458453945" observedRunningTime="2026-04-23 08:48:29.261853198 +0000 UTC m=+20.778712534" watchObservedRunningTime="2026-04-23 08:48:29.261987541 +0000 UTC m=+20.778846875" Apr 23 08:48:30.200000 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:30.199810 2574 generic.go:358] "Generic (PLEG): container finished" podID="a619dfec-0cfa-46a8-9c86-7b06735feeaa" containerID="b73d3d8556a454e2a6f51ab135bd89dd5daec51d85a4716cd6c8098b8539ccf0" exitCode=0 Apr 23 08:48:30.200476 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:30.199890 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27dts" event={"ID":"a619dfec-0cfa-46a8-9c86-7b06735feeaa","Type":"ContainerDied","Data":"b73d3d8556a454e2a6f51ab135bd89dd5daec51d85a4716cd6c8098b8539ccf0"} Apr 23 08:48:30.201669 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:30.201647 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hv5wx" event={"ID":"cf591c15-c055-4b83-8692-b39b3dd9ece5","Type":"ContainerStarted","Data":"3d080a5c6c81600e8214c22c90bd9eb5117f37d4f90f3433f00eb8626d86fa1b"} Apr 23 08:48:30.204635 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:30.204618 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log" Apr 23 08:48:30.204943 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:30.204923 2574 generic.go:358] "Generic (PLEG): container 
finished" podID="7f9090b7-c998-4b54-9c85-0df5e37b4d9d" containerID="77bcc57ce8963770ae24cc703aa7ed8ddc4013e637f035039a7f3b19686d0349" exitCode=1 Apr 23 08:48:30.205026 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:30.204960 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" event={"ID":"7f9090b7-c998-4b54-9c85-0df5e37b4d9d","Type":"ContainerStarted","Data":"642ca667a89306561d83d1a6cbecb0cbfe720c95c96c108fb97d92d34f83c55e"} Apr 23 08:48:30.205026 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:30.204986 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" event={"ID":"7f9090b7-c998-4b54-9c85-0df5e37b4d9d","Type":"ContainerStarted","Data":"856c86079d851a501b3b8fb0271a550ebd84f27f9877eb8876d970b326719abd"} Apr 23 08:48:30.205026 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:30.205001 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" event={"ID":"7f9090b7-c998-4b54-9c85-0df5e37b4d9d","Type":"ContainerStarted","Data":"5f6cc35a9e5a4dbeb7dce1afac8c1755ad1ce1c6d443133d249643adfc46a61d"} Apr 23 08:48:30.205026 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:30.205012 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" event={"ID":"7f9090b7-c998-4b54-9c85-0df5e37b4d9d","Type":"ContainerStarted","Data":"8a611799274385773872a290dd33ac491311e2799dcf17025fef2927633639d7"} Apr 23 08:48:30.205026 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:30.205025 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" event={"ID":"7f9090b7-c998-4b54-9c85-0df5e37b4d9d","Type":"ContainerDied","Data":"77bcc57ce8963770ae24cc703aa7ed8ddc4013e637f035039a7f3b19686d0349"} Apr 23 08:48:30.237973 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:30.237919 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-operator/iptables-alerter-hv5wx" podStartSLOduration=4.216803244 podStartE2EDuration="21.23790251s" podCreationTimestamp="2026-04-23 08:48:09 +0000 UTC" firstStartedPulling="2026-04-23 08:48:11.719571375 +0000 UTC m=+3.236430689" lastFinishedPulling="2026-04-23 08:48:28.74067063 +0000 UTC m=+20.257529955" observedRunningTime="2026-04-23 08:48:30.237520517 +0000 UTC m=+21.754379852" watchObservedRunningTime="2026-04-23 08:48:30.23790251 +0000 UTC m=+21.754761845" Apr 23 08:48:30.315762 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:30.315739 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 08:48:31.003676 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:31.003476 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T08:48:30.315758029Z","UUID":"a1fb3e0c-4af3-48c6-9d03-2b801badea36","Handler":null,"Name":"","Endpoint":""} Apr 23 08:48:31.005191 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:31.005170 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 08:48:31.005321 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:31.005198 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 08:48:31.066438 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:31.066338 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:31.066607 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:31.066469 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xksm7" podUID="96a803a2-c0b4-4975-b7d2-ca3157aa9f00" Apr 23 08:48:31.066877 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:31.066854 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:31.067004 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:31.066962 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wnmff" podUID="b87e8c66-90ff-454c-9c82-3fe28797e8df" Apr 23 08:48:31.208128 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:31.208091 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2" event={"ID":"fcf385b3-bad6-4ba8-ad14-e86462762f6a","Type":"ContainerStarted","Data":"9c71e1e512ed31c95b011127c22f6f0018043e9f065e8c5c31d2f60b63512823"} Apr 23 08:48:31.803688 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:31.803603 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-td2mf" Apr 23 08:48:31.804208 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:31.804184 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-td2mf" Apr 23 08:48:32.213243 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:32.213211 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log" Apr 23 08:48:32.213690 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:32.213607 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" event={"ID":"7f9090b7-c998-4b54-9c85-0df5e37b4d9d","Type":"ContainerStarted","Data":"86450badc65aac49966a8a3e43afe32f378b5c9801be14e59a538dbd237a3728"} Apr 23 08:48:32.215798 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:32.215772 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2" event={"ID":"fcf385b3-bad6-4ba8-ad14-e86462762f6a","Type":"ContainerStarted","Data":"8113c392d1ca0c42d99edc81469ace188780ab9a66b8776d522ebc0be5971e9a"} Apr 23 08:48:32.216017 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:32.216000 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-td2mf" Apr 23 
08:48:32.216575 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:32.216554 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-td2mf" Apr 23 08:48:32.234468 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:32.234432 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7lpk2" podStartSLOduration=3.545036073 podStartE2EDuration="23.234420839s" podCreationTimestamp="2026-04-23 08:48:09 +0000 UTC" firstStartedPulling="2026-04-23 08:48:11.716536757 +0000 UTC m=+3.233396069" lastFinishedPulling="2026-04-23 08:48:31.405921519 +0000 UTC m=+22.922780835" observedRunningTime="2026-04-23 08:48:32.234076239 +0000 UTC m=+23.750935572" watchObservedRunningTime="2026-04-23 08:48:32.234420839 +0000 UTC m=+23.751280307" Apr 23 08:48:33.065914 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:33.065838 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:33.065914 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:33.065844 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:33.066114 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:33.066013 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wnmff" podUID="b87e8c66-90ff-454c-9c82-3fe28797e8df" Apr 23 08:48:33.066159 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:33.066113 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xksm7" podUID="96a803a2-c0b4-4975-b7d2-ca3157aa9f00" Apr 23 08:48:34.222212 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:34.222049 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log" Apr 23 08:48:34.224583 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:34.224551 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27dts" event={"ID":"a619dfec-0cfa-46a8-9c86-7b06735feeaa","Type":"ContainerStarted","Data":"aee75e00cdf3807c7f26e891ea12dbf13e053bd412d8e7d3d65d67fa0d03fa33"} Apr 23 08:48:35.065697 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:35.065665 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:35.065858 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:35.065666 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:35.065858 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:35.065762 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xksm7" podUID="96a803a2-c0b4-4975-b7d2-ca3157aa9f00" Apr 23 08:48:35.065858 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:35.065840 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wnmff" podUID="b87e8c66-90ff-454c-9c82-3fe28797e8df" Apr 23 08:48:35.227244 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:35.227213 2574 generic.go:358] "Generic (PLEG): container finished" podID="a619dfec-0cfa-46a8-9c86-7b06735feeaa" containerID="aee75e00cdf3807c7f26e891ea12dbf13e053bd412d8e7d3d65d67fa0d03fa33" exitCode=0 Apr 23 08:48:35.227690 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:35.227282 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27dts" event={"ID":"a619dfec-0cfa-46a8-9c86-7b06735feeaa","Type":"ContainerDied","Data":"aee75e00cdf3807c7f26e891ea12dbf13e053bd412d8e7d3d65d67fa0d03fa33"} Apr 23 08:48:35.230466 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:35.230450 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log" Apr 23 08:48:35.230780 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:35.230758 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" event={"ID":"7f9090b7-c998-4b54-9c85-0df5e37b4d9d","Type":"ContainerStarted","Data":"f0aac889faf7fc0c621da3ea1ab88dfa744d8e79fd53c607207b6eb53150c7a7"} Apr 23 08:48:35.231041 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:35.231026 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" Apr 23 08:48:35.231091 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:35.231054 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" Apr 23 08:48:35.231202 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:35.231180 2574 scope.go:117] "RemoveContainer" containerID="77bcc57ce8963770ae24cc703aa7ed8ddc4013e637f035039a7f3b19686d0349" Apr 23 08:48:35.249033 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:35.249011 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" Apr 23 08:48:35.249520 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:35.249504 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" Apr 23 08:48:35.264902 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:35.264886 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" Apr 23 08:48:36.121689 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:36.121661 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xksm7"] Apr 23 08:48:36.121845 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:36.121794 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:36.121921 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:36.121893 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xksm7" podUID="96a803a2-c0b4-4975-b7d2-ca3157aa9f00" Apr 23 08:48:36.124764 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:36.124737 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wnmff"] Apr 23 08:48:36.124862 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:36.124842 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:36.124981 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:36.124958 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wnmff" podUID="b87e8c66-90ff-454c-9c82-3fe28797e8df" Apr 23 08:48:36.234840 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:36.234757 2574 generic.go:358] "Generic (PLEG): container finished" podID="a619dfec-0cfa-46a8-9c86-7b06735feeaa" containerID="8456d06e37f9e1edef0aca82bef6c646dd5e9924361128d77cd4e84b0f7546f4" exitCode=0 Apr 23 08:48:36.235507 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:36.234848 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27dts" event={"ID":"a619dfec-0cfa-46a8-9c86-7b06735feeaa","Type":"ContainerDied","Data":"8456d06e37f9e1edef0aca82bef6c646dd5e9924361128d77cd4e84b0f7546f4"} Apr 23 08:48:36.238114 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:36.238095 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log" Apr 23 08:48:36.238411 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:36.238378 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" event={"ID":"7f9090b7-c998-4b54-9c85-0df5e37b4d9d","Type":"ContainerStarted","Data":"a59875ba53d37657f20db95b3ed41c3ead3d1d092973a44957701912a45647f3"} Apr 23 08:48:36.282123 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:36.282084 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" podStartSLOduration=9.944302877 podStartE2EDuration="27.282072679s" podCreationTimestamp="2026-04-23 08:48:09 +0000 UTC" firstStartedPulling="2026-04-23 08:48:11.724916083 +0000 UTC m=+3.241775399" lastFinishedPulling="2026-04-23 08:48:29.062685882 +0000 UTC m=+20.579545201" observedRunningTime="2026-04-23 08:48:36.282020309 +0000 UTC m=+27.798879644" watchObservedRunningTime="2026-04-23 08:48:36.282072679 +0000 UTC m=+27.798932014" Apr 23 08:48:37.241847 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:37.241817 2574 generic.go:358] "Generic (PLEG): container finished" podID="a619dfec-0cfa-46a8-9c86-7b06735feeaa" containerID="462fdefdd4137af95654aaa80b55c04c4b63ee6d82dbe45b424f737dd6836429" exitCode=0 Apr 23 08:48:37.242276 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:37.241913 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27dts" event={"ID":"a619dfec-0cfa-46a8-9c86-7b06735feeaa","Type":"ContainerDied","Data":"462fdefdd4137af95654aaa80b55c04c4b63ee6d82dbe45b424f737dd6836429"} Apr 23 08:48:38.065641 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:38.065606 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:38.065827 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:38.065606 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:38.065827 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:38.065738 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wnmff" podUID="b87e8c66-90ff-454c-9c82-3fe28797e8df" Apr 23 08:48:38.065827 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:38.065785 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xksm7" podUID="96a803a2-c0b4-4975-b7d2-ca3157aa9f00" Apr 23 08:48:40.065736 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:40.065704 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:40.065736 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:40.065734 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:40.066353 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:40.065821 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xksm7" podUID="96a803a2-c0b4-4975-b7d2-ca3157aa9f00" Apr 23 08:48:40.066353 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:40.065958 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wnmff" podUID="b87e8c66-90ff-454c-9c82-3fe28797e8df" Apr 23 08:48:42.065885 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.065641 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:42.066334 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.065641 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:42.066334 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:42.065992 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xksm7" podUID="96a803a2-c0b4-4975-b7d2-ca3157aa9f00" Apr 23 08:48:42.066334 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:42.066080 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wnmff" podUID="b87e8c66-90ff-454c-9c82-3fe28797e8df" Apr 23 08:48:42.692595 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.692553 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs\") pod \"network-metrics-daemon-wnmff\" (UID: \"b87e8c66-90ff-454c-9c82-3fe28797e8df\") " pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:42.692792 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:42.692681 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:42.692792 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:42.692743 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs podName:b87e8c66-90ff-454c-9c82-3fe28797e8df nodeName:}" failed. No retries permitted until 2026-04-23 08:49:14.692727062 +0000 UTC m=+66.209586378 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs") pod "network-metrics-daemon-wnmff" (UID: "b87e8c66-90ff-454c-9c82-3fe28797e8df") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:42.840505 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.840441 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-48.ec2.internal" event="NodeReady" Apr 23 08:48:42.840631 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.840545 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 08:48:42.883913 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.883881 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fxlql"] Apr 23 08:48:42.894102 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.894075 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhfwl\" (UniqueName: \"kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl\") pod \"network-check-target-xksm7\" (UID: \"96a803a2-c0b4-4975-b7d2-ca3157aa9f00\") " pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:42.894242 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:42.894225 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:48:42.894297 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:42.894245 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:48:42.894297 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:42.894257 2574 projected.go:194] Error preparing data for projected volume kube-api-access-qhfwl for pod 
openshift-network-diagnostics/network-check-target-xksm7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:42.894359 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:42.894311 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl podName:96a803a2-c0b4-4975-b7d2-ca3157aa9f00 nodeName:}" failed. No retries permitted until 2026-04-23 08:49:14.89429438 +0000 UTC m=+66.411153699 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-qhfwl" (UniqueName: "kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl") pod "network-check-target-xksm7" (UID: "96a803a2-c0b4-4975-b7d2-ca3157aa9f00") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:42.900759 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.900738 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pnnnr"] Apr 23 08:48:42.900923 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.900902 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fxlql" Apr 23 08:48:42.903609 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.903571 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 08:48:42.903723 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.903666 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hskmq\"" Apr 23 08:48:42.903788 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.903717 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 08:48:42.914986 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.914969 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fxlql"] Apr 23 08:48:42.914986 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.914988 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pnnnr"] Apr 23 08:48:42.915102 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.915062 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pnnnr" Apr 23 08:48:42.917649 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.917627 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 08:48:42.917733 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.917681 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 08:48:42.917733 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.917691 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 08:48:42.917733 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.917683 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-55hsm\"" Apr 23 08:48:42.994623 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.994597 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls\") pod \"dns-default-fxlql\" (UID: \"67ac0c42-3257-496f-9fbd-98d7b980d7d5\") " pod="openshift-dns/dns-default-fxlql" Apr 23 08:48:42.994745 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.994628 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99g5l\" (UniqueName: \"kubernetes.io/projected/67ac0c42-3257-496f-9fbd-98d7b980d7d5-kube-api-access-99g5l\") pod \"dns-default-fxlql\" (UID: \"67ac0c42-3257-496f-9fbd-98d7b980d7d5\") " pod="openshift-dns/dns-default-fxlql" Apr 23 08:48:42.994745 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.994661 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmh9n\" (UniqueName: 
\"kubernetes.io/projected/85967c1f-fcc5-477d-949f-62d16a82fb18-kube-api-access-kmh9n\") pod \"ingress-canary-pnnnr\" (UID: \"85967c1f-fcc5-477d-949f-62d16a82fb18\") " pod="openshift-ingress-canary/ingress-canary-pnnnr" Apr 23 08:48:42.994745 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.994696 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67ac0c42-3257-496f-9fbd-98d7b980d7d5-config-volume\") pod \"dns-default-fxlql\" (UID: \"67ac0c42-3257-496f-9fbd-98d7b980d7d5\") " pod="openshift-dns/dns-default-fxlql" Apr 23 08:48:42.994745 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.994713 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/67ac0c42-3257-496f-9fbd-98d7b980d7d5-tmp-dir\") pod \"dns-default-fxlql\" (UID: \"67ac0c42-3257-496f-9fbd-98d7b980d7d5\") " pod="openshift-dns/dns-default-fxlql" Apr 23 08:48:42.994745 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:42.994746 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert\") pod \"ingress-canary-pnnnr\" (UID: \"85967c1f-fcc5-477d-949f-62d16a82fb18\") " pod="openshift-ingress-canary/ingress-canary-pnnnr" Apr 23 08:48:43.095437 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:43.095371 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert\") pod \"ingress-canary-pnnnr\" (UID: \"85967c1f-fcc5-477d-949f-62d16a82fb18\") " pod="openshift-ingress-canary/ingress-canary-pnnnr" Apr 23 08:48:43.095793 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:43.095440 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls\") pod \"dns-default-fxlql\" (UID: \"67ac0c42-3257-496f-9fbd-98d7b980d7d5\") " pod="openshift-dns/dns-default-fxlql" Apr 23 08:48:43.095793 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:43.095469 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99g5l\" (UniqueName: \"kubernetes.io/projected/67ac0c42-3257-496f-9fbd-98d7b980d7d5-kube-api-access-99g5l\") pod \"dns-default-fxlql\" (UID: \"67ac0c42-3257-496f-9fbd-98d7b980d7d5\") " pod="openshift-dns/dns-default-fxlql" Apr 23 08:48:43.095793 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:43.095504 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:48:43.095793 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:43.095553 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert podName:85967c1f-fcc5-477d-949f-62d16a82fb18 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:43.595539571 +0000 UTC m=+35.112398885 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert") pod "ingress-canary-pnnnr" (UID: "85967c1f-fcc5-477d-949f-62d16a82fb18") : secret "canary-serving-cert" not found Apr 23 08:48:43.095793 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:43.095568 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:48:43.095793 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:43.095507 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmh9n\" (UniqueName: \"kubernetes.io/projected/85967c1f-fcc5-477d-949f-62d16a82fb18-kube-api-access-kmh9n\") pod \"ingress-canary-pnnnr\" (UID: \"85967c1f-fcc5-477d-949f-62d16a82fb18\") " pod="openshift-ingress-canary/ingress-canary-pnnnr" Apr 23 08:48:43.095793 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:43.095611 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls podName:67ac0c42-3257-496f-9fbd-98d7b980d7d5 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:43.595598108 +0000 UTC m=+35.112457421 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls") pod "dns-default-fxlql" (UID: "67ac0c42-3257-496f-9fbd-98d7b980d7d5") : secret "dns-default-metrics-tls" not found Apr 23 08:48:43.095793 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:43.095650 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67ac0c42-3257-496f-9fbd-98d7b980d7d5-config-volume\") pod \"dns-default-fxlql\" (UID: \"67ac0c42-3257-496f-9fbd-98d7b980d7d5\") " pod="openshift-dns/dns-default-fxlql" Apr 23 08:48:43.095793 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:43.095671 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/67ac0c42-3257-496f-9fbd-98d7b980d7d5-tmp-dir\") pod \"dns-default-fxlql\" (UID: \"67ac0c42-3257-496f-9fbd-98d7b980d7d5\") " pod="openshift-dns/dns-default-fxlql" Apr 23 08:48:43.096081 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:43.095886 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/67ac0c42-3257-496f-9fbd-98d7b980d7d5-tmp-dir\") pod \"dns-default-fxlql\" (UID: \"67ac0c42-3257-496f-9fbd-98d7b980d7d5\") " pod="openshift-dns/dns-default-fxlql" Apr 23 08:48:43.096181 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:43.096154 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67ac0c42-3257-496f-9fbd-98d7b980d7d5-config-volume\") pod \"dns-default-fxlql\" (UID: \"67ac0c42-3257-496f-9fbd-98d7b980d7d5\") " pod="openshift-dns/dns-default-fxlql" Apr 23 08:48:43.106125 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:43.106101 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99g5l\" (UniqueName: 
\"kubernetes.io/projected/67ac0c42-3257-496f-9fbd-98d7b980d7d5-kube-api-access-99g5l\") pod \"dns-default-fxlql\" (UID: \"67ac0c42-3257-496f-9fbd-98d7b980d7d5\") " pod="openshift-dns/dns-default-fxlql" Apr 23 08:48:43.106210 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:43.106189 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmh9n\" (UniqueName: \"kubernetes.io/projected/85967c1f-fcc5-477d-949f-62d16a82fb18-kube-api-access-kmh9n\") pod \"ingress-canary-pnnnr\" (UID: \"85967c1f-fcc5-477d-949f-62d16a82fb18\") " pod="openshift-ingress-canary/ingress-canary-pnnnr" Apr 23 08:48:43.599295 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:43.599265 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls\") pod \"dns-default-fxlql\" (UID: \"67ac0c42-3257-496f-9fbd-98d7b980d7d5\") " pod="openshift-dns/dns-default-fxlql" Apr 23 08:48:43.599470 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:43.599333 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert\") pod \"ingress-canary-pnnnr\" (UID: \"85967c1f-fcc5-477d-949f-62d16a82fb18\") " pod="openshift-ingress-canary/ingress-canary-pnnnr" Apr 23 08:48:43.599470 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:43.599441 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:48:43.599470 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:43.599440 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:48:43.599585 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:43.599491 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert 
podName:85967c1f-fcc5-477d-949f-62d16a82fb18 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:44.599478085 +0000 UTC m=+36.116337403 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert") pod "ingress-canary-pnnnr" (UID: "85967c1f-fcc5-477d-949f-62d16a82fb18") : secret "canary-serving-cert" not found Apr 23 08:48:43.599585 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:43.599504 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls podName:67ac0c42-3257-496f-9fbd-98d7b980d7d5 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:44.599498829 +0000 UTC m=+36.116358141 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls") pod "dns-default-fxlql" (UID: "67ac0c42-3257-496f-9fbd-98d7b980d7d5") : secret "dns-default-metrics-tls" not found Apr 23 08:48:44.066293 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:44.066258 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:48:44.066492 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:44.066258 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:48:44.069231 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:44.069200 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 08:48:44.070310 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:44.070291 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zffjs\"" Apr 23 08:48:44.070442 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:44.070374 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 08:48:44.070442 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:44.070421 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 08:48:44.070530 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:44.070456 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-h75n2\"" Apr 23 08:48:44.256431 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:44.256404 2574 generic.go:358] "Generic (PLEG): container finished" podID="a619dfec-0cfa-46a8-9c86-7b06735feeaa" containerID="50acb1a9fcfd701cebacfd5b14d08996f391bfb1d252432c7050175033fd263b" exitCode=0 Apr 23 08:48:44.256853 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:44.256438 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27dts" event={"ID":"a619dfec-0cfa-46a8-9c86-7b06735feeaa","Type":"ContainerDied","Data":"50acb1a9fcfd701cebacfd5b14d08996f391bfb1d252432c7050175033fd263b"} Apr 23 08:48:44.606877 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:44.606814 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls\") pod \"dns-default-fxlql\" (UID: \"67ac0c42-3257-496f-9fbd-98d7b980d7d5\") " pod="openshift-dns/dns-default-fxlql" Apr 23 08:48:44.606877 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:44.606876 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert\") pod \"ingress-canary-pnnnr\" (UID: \"85967c1f-fcc5-477d-949f-62d16a82fb18\") " pod="openshift-ingress-canary/ingress-canary-pnnnr" Apr 23 08:48:44.607015 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:44.606953 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:48:44.607015 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:44.606955 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:48:44.607015 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:44.607000 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert podName:85967c1f-fcc5-477d-949f-62d16a82fb18 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:46.606987041 +0000 UTC m=+38.123846354 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert") pod "ingress-canary-pnnnr" (UID: "85967c1f-fcc5-477d-949f-62d16a82fb18") : secret "canary-serving-cert" not found Apr 23 08:48:44.607015 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:44.607012 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls podName:67ac0c42-3257-496f-9fbd-98d7b980d7d5 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:46.607006477 +0000 UTC m=+38.123865790 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls") pod "dns-default-fxlql" (UID: "67ac0c42-3257-496f-9fbd-98d7b980d7d5") : secret "dns-default-metrics-tls" not found Apr 23 08:48:45.260530 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:45.260497 2574 generic.go:358] "Generic (PLEG): container finished" podID="a619dfec-0cfa-46a8-9c86-7b06735feeaa" containerID="21c7249e7213e1b7587b6cc02618f576d1e471f1aa919d3cd7a21bf69ede9d53" exitCode=0 Apr 23 08:48:45.260893 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:45.260557 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27dts" event={"ID":"a619dfec-0cfa-46a8-9c86-7b06735feeaa","Type":"ContainerDied","Data":"21c7249e7213e1b7587b6cc02618f576d1e471f1aa919d3cd7a21bf69ede9d53"} Apr 23 08:48:46.265322 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:46.265135 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27dts" event={"ID":"a619dfec-0cfa-46a8-9c86-7b06735feeaa","Type":"ContainerStarted","Data":"d9bd29ab5b90d158e14275f2eb92fd3e3ef94f220b75e3a60e8d7de2fbb98535"} Apr 23 08:48:46.287266 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:46.287210 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-27dts" podStartSLOduration=5.838516328 podStartE2EDuration="37.287196496s" podCreationTimestamp="2026-04-23 08:48:09 +0000 UTC" firstStartedPulling="2026-04-23 08:48:11.720752621 +0000 UTC m=+3.237611950" lastFinishedPulling="2026-04-23 08:48:43.169432805 +0000 UTC m=+34.686292118" observedRunningTime="2026-04-23 08:48:46.285912312 +0000 UTC m=+37.802771644" watchObservedRunningTime="2026-04-23 08:48:46.287196496 +0000 UTC m=+37.804055833" Apr 23 08:48:46.618399 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:46.618365 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert\") pod \"ingress-canary-pnnnr\" (UID: \"85967c1f-fcc5-477d-949f-62d16a82fb18\") " pod="openshift-ingress-canary/ingress-canary-pnnnr" Apr 23 08:48:46.618547 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:46.618427 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls\") pod \"dns-default-fxlql\" (UID: \"67ac0c42-3257-496f-9fbd-98d7b980d7d5\") " pod="openshift-dns/dns-default-fxlql" Apr 23 08:48:46.618547 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:46.618500 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:48:46.618547 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:46.618507 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:48:46.618644 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:46.618556 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert podName:85967c1f-fcc5-477d-949f-62d16a82fb18 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:50.618541266 +0000 UTC m=+42.135400583 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert") pod "ingress-canary-pnnnr" (UID: "85967c1f-fcc5-477d-949f-62d16a82fb18") : secret "canary-serving-cert" not found Apr 23 08:48:46.618644 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:46.618570 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls podName:67ac0c42-3257-496f-9fbd-98d7b980d7d5 nodeName:}" failed. 
No retries permitted until 2026-04-23 08:48:50.618564328 +0000 UTC m=+42.135423641 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls") pod "dns-default-fxlql" (UID: "67ac0c42-3257-496f-9fbd-98d7b980d7d5") : secret "dns-default-metrics-tls" not found Apr 23 08:48:50.645446 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:50.645408 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert\") pod \"ingress-canary-pnnnr\" (UID: \"85967c1f-fcc5-477d-949f-62d16a82fb18\") " pod="openshift-ingress-canary/ingress-canary-pnnnr" Apr 23 08:48:50.645879 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:50.645459 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls\") pod \"dns-default-fxlql\" (UID: \"67ac0c42-3257-496f-9fbd-98d7b980d7d5\") " pod="openshift-dns/dns-default-fxlql" Apr 23 08:48:50.645879 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:50.645544 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:48:50.645879 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:50.645584 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:48:50.645879 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:50.645602 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert podName:85967c1f-fcc5-477d-949f-62d16a82fb18 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:58.6455872 +0000 UTC m=+50.162446517 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert") pod "ingress-canary-pnnnr" (UID: "85967c1f-fcc5-477d-949f-62d16a82fb18") : secret "canary-serving-cert" not found Apr 23 08:48:50.645879 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:50.645635 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls podName:67ac0c42-3257-496f-9fbd-98d7b980d7d5 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:58.64562005 +0000 UTC m=+50.162479387 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls") pod "dns-default-fxlql" (UID: "67ac0c42-3257-496f-9fbd-98d7b980d7d5") : secret "dns-default-metrics-tls" not found Apr 23 08:48:58.697137 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:58.697097 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert\") pod \"ingress-canary-pnnnr\" (UID: \"85967c1f-fcc5-477d-949f-62d16a82fb18\") " pod="openshift-ingress-canary/ingress-canary-pnnnr" Apr 23 08:48:58.697137 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:48:58.697144 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls\") pod \"dns-default-fxlql\" (UID: \"67ac0c42-3257-496f-9fbd-98d7b980d7d5\") " pod="openshift-dns/dns-default-fxlql" Apr 23 08:48:58.697739 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:58.697240 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:48:58.697739 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:58.697244 2574 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:48:58.697739 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:58.697291 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls podName:67ac0c42-3257-496f-9fbd-98d7b980d7d5 nodeName:}" failed. No retries permitted until 2026-04-23 08:49:14.69727804 +0000 UTC m=+66.214137353 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls") pod "dns-default-fxlql" (UID: "67ac0c42-3257-496f-9fbd-98d7b980d7d5") : secret "dns-default-metrics-tls" not found Apr 23 08:48:58.697739 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:48:58.697304 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert podName:85967c1f-fcc5-477d-949f-62d16a82fb18 nodeName:}" failed. No retries permitted until 2026-04-23 08:49:14.697298455 +0000 UTC m=+66.214157767 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert") pod "ingress-canary-pnnnr" (UID: "85967c1f-fcc5-477d-949f-62d16a82fb18") : secret "canary-serving-cert" not found Apr 23 08:49:07.253407 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:49:07.253359 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k9zxl" Apr 23 08:49:14.710079 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:49:14.710043 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls\") pod \"dns-default-fxlql\" (UID: \"67ac0c42-3257-496f-9fbd-98d7b980d7d5\") " pod="openshift-dns/dns-default-fxlql" Apr 23 08:49:14.710477 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:49:14.710101 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs\") pod \"network-metrics-daemon-wnmff\" (UID: \"b87e8c66-90ff-454c-9c82-3fe28797e8df\") " pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:49:14.710477 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:49:14.710130 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert\") pod \"ingress-canary-pnnnr\" (UID: \"85967c1f-fcc5-477d-949f-62d16a82fb18\") " pod="openshift-ingress-canary/ingress-canary-pnnnr" Apr 23 08:49:14.710477 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:49:14.710191 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:49:14.710477 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:49:14.710213 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret 
"canary-serving-cert" not found Apr 23 08:49:14.710477 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:49:14.710270 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls podName:67ac0c42-3257-496f-9fbd-98d7b980d7d5 nodeName:}" failed. No retries permitted until 2026-04-23 08:49:46.710254458 +0000 UTC m=+98.227113776 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls") pod "dns-default-fxlql" (UID: "67ac0c42-3257-496f-9fbd-98d7b980d7d5") : secret "dns-default-metrics-tls" not found Apr 23 08:49:14.710477 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:49:14.710284 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert podName:85967c1f-fcc5-477d-949f-62d16a82fb18 nodeName:}" failed. No retries permitted until 2026-04-23 08:49:46.710278505 +0000 UTC m=+98.227137818 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert") pod "ingress-canary-pnnnr" (UID: "85967c1f-fcc5-477d-949f-62d16a82fb18") : secret "canary-serving-cert" not found Apr 23 08:49:14.714655 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:49:14.714637 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 08:49:14.721028 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:49:14.721016 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 08:49:14.721085 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:49:14.721050 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs podName:b87e8c66-90ff-454c-9c82-3fe28797e8df nodeName:}" failed. 
No retries permitted until 2026-04-23 08:50:18.72103992 +0000 UTC m=+130.237899237 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs") pod "network-metrics-daemon-wnmff" (UID: "b87e8c66-90ff-454c-9c82-3fe28797e8df") : secret "metrics-daemon-secret" not found Apr 23 08:49:14.911474 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:49:14.911428 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhfwl\" (UniqueName: \"kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl\") pod \"network-check-target-xksm7\" (UID: \"96a803a2-c0b4-4975-b7d2-ca3157aa9f00\") " pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:49:14.914495 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:49:14.914477 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 08:49:14.924898 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:49:14.924879 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 08:49:14.936059 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:49:14.936040 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhfwl\" (UniqueName: \"kubernetes.io/projected/96a803a2-c0b4-4975-b7d2-ca3157aa9f00-kube-api-access-qhfwl\") pod \"network-check-target-xksm7\" (UID: \"96a803a2-c0b4-4975-b7d2-ca3157aa9f00\") " pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:49:14.982960 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:49:14.982905 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-h75n2\"" Apr 23 08:49:14.990365 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:49:14.990350 2574 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:49:15.114247 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:49:15.114218 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xksm7"] Apr 23 08:49:15.117683 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:49:15.117658 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a803a2_c0b4_4975_b7d2_ca3157aa9f00.slice/crio-a897cb2644fff663c29f99d22762fb2a1986df625681bbbaa333f1f513c858e6 WatchSource:0}: Error finding container a897cb2644fff663c29f99d22762fb2a1986df625681bbbaa333f1f513c858e6: Status 404 returned error can't find the container with id a897cb2644fff663c29f99d22762fb2a1986df625681bbbaa333f1f513c858e6 Apr 23 08:49:15.317710 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:49:15.317633 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xksm7" event={"ID":"96a803a2-c0b4-4975-b7d2-ca3157aa9f00","Type":"ContainerStarted","Data":"a897cb2644fff663c29f99d22762fb2a1986df625681bbbaa333f1f513c858e6"} Apr 23 08:49:18.325037 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:49:18.325001 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xksm7" event={"ID":"96a803a2-c0b4-4975-b7d2-ca3157aa9f00","Type":"ContainerStarted","Data":"4800cff8ffa54aa21f929f6b1ab3ff9d44ffbcf4ea71f6a66567c33d3167c99d"} Apr 23 08:49:18.325438 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:49:18.325161 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xksm7" Apr 23 08:49:18.342741 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:49:18.342681 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-xksm7" podStartSLOduration=66.748783313 podStartE2EDuration="1m9.34267031s" podCreationTimestamp="2026-04-23 08:48:09 +0000 UTC" firstStartedPulling="2026-04-23 08:49:15.119368921 +0000 UTC m=+66.636228235" lastFinishedPulling="2026-04-23 08:49:17.713255915 +0000 UTC m=+69.230115232" observedRunningTime="2026-04-23 08:49:18.342158586 +0000 UTC m=+69.859017934" watchObservedRunningTime="2026-04-23 08:49:18.34267031 +0000 UTC m=+69.859529644" Apr 23 08:49:46.724680 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:49:46.724621 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls\") pod \"dns-default-fxlql\" (UID: \"67ac0c42-3257-496f-9fbd-98d7b980d7d5\") " pod="openshift-dns/dns-default-fxlql" Apr 23 08:49:46.725161 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:49:46.724699 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert\") pod \"ingress-canary-pnnnr\" (UID: \"85967c1f-fcc5-477d-949f-62d16a82fb18\") " pod="openshift-ingress-canary/ingress-canary-pnnnr" Apr 23 08:49:46.725161 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:49:46.724779 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:49:46.725161 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:49:46.724785 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:49:46.725161 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:49:46.724833 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert podName:85967c1f-fcc5-477d-949f-62d16a82fb18 nodeName:}" failed. 
No retries permitted until 2026-04-23 08:50:50.724816933 +0000 UTC m=+162.241676246 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert") pod "ingress-canary-pnnnr" (UID: "85967c1f-fcc5-477d-949f-62d16a82fb18") : secret "canary-serving-cert" not found
Apr 23 08:49:46.725161 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:49:46.724861 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls podName:67ac0c42-3257-496f-9fbd-98d7b980d7d5 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:50.724839945 +0000 UTC m=+162.241699262 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls") pod "dns-default-fxlql" (UID: "67ac0c42-3257-496f-9fbd-98d7b980d7d5") : secret "dns-default-metrics-tls" not found
Apr 23 08:49:49.329315 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:49:49.329285 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xksm7"
Apr 23 08:50:02.944585 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:02.944546 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9vxvt"]
Apr 23 08:50:02.947151 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:02.947136 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9vxvt"
Apr 23 08:50:02.949905 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:02.949874 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:50:02.949905 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:02.949891 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 23 08:50:02.950056 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:02.949955 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-v9d26\""
Apr 23 08:50:02.956863 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:02.956840 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9vxvt"]
Apr 23 08:50:03.027855 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.027811 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqc65\" (UniqueName: \"kubernetes.io/projected/5ddb114e-9238-4d4e-9e66-f1b3f964c562-kube-api-access-kqc65\") pod \"volume-data-source-validator-7c6cbb6c87-9vxvt\" (UID: \"5ddb114e-9238-4d4e-9e66-f1b3f964c562\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9vxvt"
Apr 23 08:50:03.050226 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.050195 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-wsgmm"]
Apr 23 08:50:03.052843 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.052825 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-wsgmm"
Apr 23 08:50:03.056533 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.056513 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-zrtth\""
Apr 23 08:50:03.057177 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.057164 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 23 08:50:03.057449 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.057435 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 23 08:50:03.057744 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.057726 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 23 08:50:03.057876 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.057758 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 23 08:50:03.059790 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.059768 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5bff4876d9-6ljch"]
Apr 23 08:50:03.062379 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.062362 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.066342 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.066320 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 23 08:50:03.066342 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.066335 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 23 08:50:03.067600 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.067543 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 23 08:50:03.067710 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.067597 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tpxkg\""
Apr 23 08:50:03.070725 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.070705 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-wsgmm"]
Apr 23 08:50:03.071872 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.071243 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 23 08:50:03.074478 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.074457 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 23 08:50:03.077796 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.077779 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5bff4876d9-6ljch"]
Apr 23 08:50:03.134412 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.128733 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqc65\" (UniqueName: \"kubernetes.io/projected/5ddb114e-9238-4d4e-9e66-f1b3f964c562-kube-api-access-kqc65\") pod \"volume-data-source-validator-7c6cbb6c87-9vxvt\" (UID: \"5ddb114e-9238-4d4e-9e66-f1b3f964c562\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9vxvt"
Apr 23 08:50:03.134412 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.128789 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/92f47ad8-406e-4c78-b8b6-7565d305e19d-ca-trust-extracted\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.134412 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.128817 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/976abe21-f40b-42ec-b745-de58f1628c36-serving-cert\") pod \"insights-operator-585dfdc468-wsgmm\" (UID: \"976abe21-f40b-42ec-b745-de58f1628c36\") " pod="openshift-insights/insights-operator-585dfdc468-wsgmm"
Apr 23 08:50:03.134412 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.128846 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.134412 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.128903 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/976abe21-f40b-42ec-b745-de58f1628c36-tmp\") pod \"insights-operator-585dfdc468-wsgmm\" (UID: \"976abe21-f40b-42ec-b745-de58f1628c36\") " pod="openshift-insights/insights-operator-585dfdc468-wsgmm"
Apr 23 08:50:03.134412 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.128927 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/976abe21-f40b-42ec-b745-de58f1628c36-service-ca-bundle\") pod \"insights-operator-585dfdc468-wsgmm\" (UID: \"976abe21-f40b-42ec-b745-de58f1628c36\") " pod="openshift-insights/insights-operator-585dfdc468-wsgmm"
Apr 23 08:50:03.134412 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.128954 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92f47ad8-406e-4c78-b8b6-7565d305e19d-trusted-ca\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.134412 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.128999 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-certificates\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.134412 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.129023 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/92f47ad8-406e-4c78-b8b6-7565d305e19d-image-registry-private-configuration\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.134412 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.129049 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/92f47ad8-406e-4c78-b8b6-7565d305e19d-installation-pull-secrets\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.134412 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.129071 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-bound-sa-token\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.134412 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.129093 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v87d\" (UniqueName: \"kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-kube-api-access-2v87d\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.134412 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.129115 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/976abe21-f40b-42ec-b745-de58f1628c36-snapshots\") pod \"insights-operator-585dfdc468-wsgmm\" (UID: \"976abe21-f40b-42ec-b745-de58f1628c36\") " pod="openshift-insights/insights-operator-585dfdc468-wsgmm"
Apr 23 08:50:03.134412 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.129139 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhtgm\" (UniqueName: \"kubernetes.io/projected/976abe21-f40b-42ec-b745-de58f1628c36-kube-api-access-rhtgm\") pod \"insights-operator-585dfdc468-wsgmm\" (UID: \"976abe21-f40b-42ec-b745-de58f1628c36\") " pod="openshift-insights/insights-operator-585dfdc468-wsgmm"
Apr 23 08:50:03.134412 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.129166 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/976abe21-f40b-42ec-b745-de58f1628c36-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-wsgmm\" (UID: \"976abe21-f40b-42ec-b745-de58f1628c36\") " pod="openshift-insights/insights-operator-585dfdc468-wsgmm"
Apr 23 08:50:03.138571 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.138539 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqc65\" (UniqueName: \"kubernetes.io/projected/5ddb114e-9238-4d4e-9e66-f1b3f964c562-kube-api-access-kqc65\") pod \"volume-data-source-validator-7c6cbb6c87-9vxvt\" (UID: \"5ddb114e-9238-4d4e-9e66-f1b3f964c562\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9vxvt"
Apr 23 08:50:03.149712 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.149675 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g296"]
Apr 23 08:50:03.152563 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.152544 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-wttb6"]
Apr 23 08:50:03.152723 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.152704 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g296"
Apr 23 08:50:03.154821 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.154804 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cvwjc"]
Apr 23 08:50:03.154942 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.154927 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wttb6"
Apr 23 08:50:03.155565 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.155548 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 23 08:50:03.155923 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.155909 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-t6728\""
Apr 23 08:50:03.156281 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.156265 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:50:03.156347 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.156281 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 23 08:50:03.156545 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.156532 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 23 08:50:03.157175 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.157162 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cvwjc"
Apr 23 08:50:03.157727 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.157714 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-9sgfx\""
Apr 23 08:50:03.159788 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.159770 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 23 08:50:03.159856 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.159774 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:50:03.160088 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.160065 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-rlv9f\""
Apr 23 08:50:03.160164 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.160092 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 23 08:50:03.160164 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.160067 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 23 08:50:03.165583 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.165563 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g296"]
Apr 23 08:50:03.166302 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.166275 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-wttb6"]
Apr 23 08:50:03.167286 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.167262 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cvwjc"]
Apr 23 08:50:03.230496 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.230416 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/976abe21-f40b-42ec-b745-de58f1628c36-snapshots\") pod \"insights-operator-585dfdc468-wsgmm\" (UID: \"976abe21-f40b-42ec-b745-de58f1628c36\") " pod="openshift-insights/insights-operator-585dfdc468-wsgmm"
Apr 23 08:50:03.230496 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.230449 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhtgm\" (UniqueName: \"kubernetes.io/projected/976abe21-f40b-42ec-b745-de58f1628c36-kube-api-access-rhtgm\") pod \"insights-operator-585dfdc468-wsgmm\" (UID: \"976abe21-f40b-42ec-b745-de58f1628c36\") " pod="openshift-insights/insights-operator-585dfdc468-wsgmm"
Apr 23 08:50:03.230496 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.230477 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/976abe21-f40b-42ec-b745-de58f1628c36-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-wsgmm\" (UID: \"976abe21-f40b-42ec-b745-de58f1628c36\") " pod="openshift-insights/insights-operator-585dfdc468-wsgmm"
Apr 23 08:50:03.230766 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.230511 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef9bf61-ead5-4bec-bd04-a167a6f7321f-config\") pod \"service-ca-operator-d6fc45fc5-cvwjc\" (UID: \"6ef9bf61-ead5-4bec-bd04-a167a6f7321f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cvwjc"
Apr 23 08:50:03.230766 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.230537 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68blf\" (UniqueName: \"kubernetes.io/projected/7e40391b-c30d-4041-b6cf-e4b9e590a499-kube-api-access-68blf\") pod \"network-check-source-8894fc9bd-wttb6\" (UID: \"7e40391b-c30d-4041-b6cf-e4b9e590a499\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wttb6"
Apr 23 08:50:03.230766 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.230566 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/92f47ad8-406e-4c78-b8b6-7565d305e19d-ca-trust-extracted\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.230766 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.230587 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/976abe21-f40b-42ec-b745-de58f1628c36-serving-cert\") pod \"insights-operator-585dfdc468-wsgmm\" (UID: \"976abe21-f40b-42ec-b745-de58f1628c36\") " pod="openshift-insights/insights-operator-585dfdc468-wsgmm"
Apr 23 08:50:03.230766 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.230635 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eba944cc-6425-4433-beef-e901be182078-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-4g296\" (UID: \"eba944cc-6425-4433-beef-e901be182078\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g296"
Apr 23 08:50:03.230766 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.230675 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2ds8\" (UniqueName: \"kubernetes.io/projected/eba944cc-6425-4433-beef-e901be182078-kube-api-access-j2ds8\") pod \"kube-storage-version-migrator-operator-6769c5d45-4g296\" (UID: \"eba944cc-6425-4433-beef-e901be182078\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g296"
Apr 23 08:50:03.230766 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.230729 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.231095 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:03.230814 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 08:50:03.231095 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.230819 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef9bf61-ead5-4bec-bd04-a167a6f7321f-serving-cert\") pod \"service-ca-operator-d6fc45fc5-cvwjc\" (UID: \"6ef9bf61-ead5-4bec-bd04-a167a6f7321f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cvwjc"
Apr 23 08:50:03.231095 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:03.230828 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bff4876d9-6ljch: secret "image-registry-tls" not found
Apr 23 08:50:03.231095 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.230852 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkw85\" (UniqueName: \"kubernetes.io/projected/6ef9bf61-ead5-4bec-bd04-a167a6f7321f-kube-api-access-wkw85\") pod \"service-ca-operator-d6fc45fc5-cvwjc\" (UID: \"6ef9bf61-ead5-4bec-bd04-a167a6f7321f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cvwjc"
Apr 23 08:50:03.231095 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:03.230962 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls podName:92f47ad8-406e-4c78-b8b6-7565d305e19d nodeName:}" failed. No retries permitted until 2026-04-23 08:50:03.730939616 +0000 UTC m=+115.247798931 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls") pod "image-registry-5bff4876d9-6ljch" (UID: "92f47ad8-406e-4c78-b8b6-7565d305e19d") : secret "image-registry-tls" not found
Apr 23 08:50:03.231095 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.230969 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/92f47ad8-406e-4c78-b8b6-7565d305e19d-ca-trust-extracted\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.231095 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.230986 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/976abe21-f40b-42ec-b745-de58f1628c36-tmp\") pod \"insights-operator-585dfdc468-wsgmm\" (UID: \"976abe21-f40b-42ec-b745-de58f1628c36\") " pod="openshift-insights/insights-operator-585dfdc468-wsgmm"
Apr 23 08:50:03.231095 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.231019 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/976abe21-f40b-42ec-b745-de58f1628c36-service-ca-bundle\") pod \"insights-operator-585dfdc468-wsgmm\" (UID: \"976abe21-f40b-42ec-b745-de58f1628c36\") " pod="openshift-insights/insights-operator-585dfdc468-wsgmm"
Apr 23 08:50:03.231095 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.231052 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92f47ad8-406e-4c78-b8b6-7565d305e19d-trusted-ca\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.231095 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.231083 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eba944cc-6425-4433-beef-e901be182078-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-4g296\" (UID: \"eba944cc-6425-4433-beef-e901be182078\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g296"
Apr 23 08:50:03.231606 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.231123 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/976abe21-f40b-42ec-b745-de58f1628c36-snapshots\") pod \"insights-operator-585dfdc468-wsgmm\" (UID: \"976abe21-f40b-42ec-b745-de58f1628c36\") " pod="openshift-insights/insights-operator-585dfdc468-wsgmm"
Apr 23 08:50:03.231606 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.231143 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-certificates\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.231606 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.231175 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/92f47ad8-406e-4c78-b8b6-7565d305e19d-image-registry-private-configuration\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.231606 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.231203 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/92f47ad8-406e-4c78-b8b6-7565d305e19d-installation-pull-secrets\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.231606 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.231228 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-bound-sa-token\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.231606 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.231254 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2v87d\" (UniqueName: \"kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-kube-api-access-2v87d\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.231606 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.231256 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/976abe21-f40b-42ec-b745-de58f1628c36-tmp\") pod \"insights-operator-585dfdc468-wsgmm\" (UID: \"976abe21-f40b-42ec-b745-de58f1628c36\") " pod="openshift-insights/insights-operator-585dfdc468-wsgmm"
Apr 23 08:50:03.231925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.231707 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/976abe21-f40b-42ec-b745-de58f1628c36-service-ca-bundle\") pod \"insights-operator-585dfdc468-wsgmm\" (UID: \"976abe21-f40b-42ec-b745-de58f1628c36\") " pod="openshift-insights/insights-operator-585dfdc468-wsgmm"
Apr 23 08:50:03.231925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.231734 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/976abe21-f40b-42ec-b745-de58f1628c36-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-wsgmm\" (UID: \"976abe21-f40b-42ec-b745-de58f1628c36\") " pod="openshift-insights/insights-operator-585dfdc468-wsgmm"
Apr 23 08:50:03.231925 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.231838 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-certificates\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.232045 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.232035 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92f47ad8-406e-4c78-b8b6-7565d305e19d-trusted-ca\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.233670 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.233649 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/976abe21-f40b-42ec-b745-de58f1628c36-serving-cert\") pod \"insights-operator-585dfdc468-wsgmm\" (UID: \"976abe21-f40b-42ec-b745-de58f1628c36\") " pod="openshift-insights/insights-operator-585dfdc468-wsgmm"
Apr 23 08:50:03.233797 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.233767 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/92f47ad8-406e-4c78-b8b6-7565d305e19d-installation-pull-secrets\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.233937 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.233921 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/92f47ad8-406e-4c78-b8b6-7565d305e19d-image-registry-private-configuration\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.240425 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.240380 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhtgm\" (UniqueName: \"kubernetes.io/projected/976abe21-f40b-42ec-b745-de58f1628c36-kube-api-access-rhtgm\") pod \"insights-operator-585dfdc468-wsgmm\" (UID: \"976abe21-f40b-42ec-b745-de58f1628c36\") " pod="openshift-insights/insights-operator-585dfdc468-wsgmm"
Apr 23 08:50:03.240991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.240975 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v87d\" (UniqueName: \"kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-kube-api-access-2v87d\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.241279 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.241259 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-bound-sa-token\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch"
Apr 23 08:50:03.255100 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.255079 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9vxvt"
Apr 23 08:50:03.332000 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.331925 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef9bf61-ead5-4bec-bd04-a167a6f7321f-config\") pod \"service-ca-operator-d6fc45fc5-cvwjc\" (UID: \"6ef9bf61-ead5-4bec-bd04-a167a6f7321f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cvwjc"
Apr 23 08:50:03.332000 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.331971 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68blf\" (UniqueName: \"kubernetes.io/projected/7e40391b-c30d-4041-b6cf-e4b9e590a499-kube-api-access-68blf\") pod \"network-check-source-8894fc9bd-wttb6\" (UID: \"7e40391b-c30d-4041-b6cf-e4b9e590a499\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wttb6"
Apr 23 08:50:03.332000 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.332002 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eba944cc-6425-4433-beef-e901be182078-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-4g296\" (UID: \"eba944cc-6425-4433-beef-e901be182078\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g296"
Apr 23 08:50:03.332225 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.332032 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2ds8\" (UniqueName: \"kubernetes.io/projected/eba944cc-6425-4433-beef-e901be182078-kube-api-access-j2ds8\") pod \"kube-storage-version-migrator-operator-6769c5d45-4g296\" (UID: \"eba944cc-6425-4433-beef-e901be182078\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g296"
Apr 23 08:50:03.332225 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.332077 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef9bf61-ead5-4bec-bd04-a167a6f7321f-serving-cert\") pod \"service-ca-operator-d6fc45fc5-cvwjc\" (UID: \"6ef9bf61-ead5-4bec-bd04-a167a6f7321f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cvwjc"
Apr 23 08:50:03.332225 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.332104 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkw85\" (UniqueName: \"kubernetes.io/projected/6ef9bf61-ead5-4bec-bd04-a167a6f7321f-kube-api-access-wkw85\") pod \"service-ca-operator-d6fc45fc5-cvwjc\" (UID: \"6ef9bf61-ead5-4bec-bd04-a167a6f7321f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cvwjc"
Apr 23 08:50:03.332225 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.332165 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eba944cc-6425-4433-beef-e901be182078-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-4g296\" (UID: \"eba944cc-6425-4433-beef-e901be182078\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g296"
Apr 23 08:50:03.332542 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.332438 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef9bf61-ead5-4bec-bd04-a167a6f7321f-config\") pod \"service-ca-operator-d6fc45fc5-cvwjc\" (UID: \"6ef9bf61-ead5-4bec-bd04-a167a6f7321f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cvwjc"
Apr 23 08:50:03.332866 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.332844 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eba944cc-6425-4433-beef-e901be182078-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-4g296\" (UID: \"eba944cc-6425-4433-beef-e901be182078\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g296"
Apr 23 08:50:03.334577 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.334559 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef9bf61-ead5-4bec-bd04-a167a6f7321f-serving-cert\") pod \"service-ca-operator-d6fc45fc5-cvwjc\" (UID: \"6ef9bf61-ead5-4bec-bd04-a167a6f7321f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cvwjc"
Apr 23 08:50:03.334880 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.334860 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eba944cc-6425-4433-beef-e901be182078-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-4g296\" (UID: \"eba944cc-6425-4433-beef-e901be182078\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g296"
Apr 23 08:50:03.342979 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.342959 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68blf\" (UniqueName: \"kubernetes.io/projected/7e40391b-c30d-4041-b6cf-e4b9e590a499-kube-api-access-68blf\") pod \"network-check-source-8894fc9bd-wttb6\" (UID: \"7e40391b-c30d-4041-b6cf-e4b9e590a499\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wttb6" Apr 23 08:50:03.344581 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.344561 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkw85\" (UniqueName: \"kubernetes.io/projected/6ef9bf61-ead5-4bec-bd04-a167a6f7321f-kube-api-access-wkw85\") pod \"service-ca-operator-d6fc45fc5-cvwjc\" (UID: \"6ef9bf61-ead5-4bec-bd04-a167a6f7321f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cvwjc" Apr 23 08:50:03.344830 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.344812 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2ds8\" (UniqueName: \"kubernetes.io/projected/eba944cc-6425-4433-beef-e901be182078-kube-api-access-j2ds8\") pod \"kube-storage-version-migrator-operator-6769c5d45-4g296\" (UID: \"eba944cc-6425-4433-beef-e901be182078\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g296" Apr 23 08:50:03.362763 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.362733 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-wsgmm" Apr 23 08:50:03.367997 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.367971 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9vxvt"] Apr 23 08:50:03.370905 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:50:03.370873 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ddb114e_9238_4d4e_9e66_f1b3f964c562.slice/crio-632b8e28bb3183fab64b56f6dee58173f97027fe2b9e037e267e87728c7362a9 WatchSource:0}: Error finding container 632b8e28bb3183fab64b56f6dee58173f97027fe2b9e037e267e87728c7362a9: Status 404 returned error can't find the container with id 632b8e28bb3183fab64b56f6dee58173f97027fe2b9e037e267e87728c7362a9 Apr 23 08:50:03.411584 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.411545 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9vxvt" event={"ID":"5ddb114e-9238-4d4e-9e66-f1b3f964c562","Type":"ContainerStarted","Data":"632b8e28bb3183fab64b56f6dee58173f97027fe2b9e037e267e87728c7362a9"} Apr 23 08:50:03.462673 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.462646 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g296" Apr 23 08:50:03.470415 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.470375 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wttb6" Apr 23 08:50:03.475079 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.475044 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cvwjc" Apr 23 08:50:03.480551 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.480496 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-wsgmm"] Apr 23 08:50:03.483866 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:50:03.483773 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod976abe21_f40b_42ec_b745_de58f1628c36.slice/crio-82e8a60f2a90f8c0de77802509e6b3f5d1215b746e4aa0f5a4dd848a52c9aa58 WatchSource:0}: Error finding container 82e8a60f2a90f8c0de77802509e6b3f5d1215b746e4aa0f5a4dd848a52c9aa58: Status 404 returned error can't find the container with id 82e8a60f2a90f8c0de77802509e6b3f5d1215b746e4aa0f5a4dd848a52c9aa58 Apr 23 08:50:03.602027 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.602002 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g296"] Apr 23 08:50:03.604882 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:50:03.604856 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeba944cc_6425_4433_beef_e901be182078.slice/crio-23fe37ae3788aade4b685acf47ecf187a17897255b83d57f81852b193dce7055 WatchSource:0}: Error finding container 23fe37ae3788aade4b685acf47ecf187a17897255b83d57f81852b193dce7055: Status 404 returned error can't find the container with id 23fe37ae3788aade4b685acf47ecf187a17897255b83d57f81852b193dce7055 Apr 23 08:50:03.736144 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.736060 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls\") pod \"image-registry-5bff4876d9-6ljch\" (UID: 
\"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch" Apr 23 08:50:03.736288 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:03.736206 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:50:03.736288 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:03.736226 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bff4876d9-6ljch: secret "image-registry-tls" not found Apr 23 08:50:03.736288 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:03.736283 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls podName:92f47ad8-406e-4c78-b8b6-7565d305e19d nodeName:}" failed. No retries permitted until 2026-04-23 08:50:04.736267988 +0000 UTC m=+116.253127305 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls") pod "image-registry-5bff4876d9-6ljch" (UID: "92f47ad8-406e-4c78-b8b6-7565d305e19d") : secret "image-registry-tls" not found Apr 23 08:50:03.817855 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.817685 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-wttb6"] Apr 23 08:50:03.818491 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:03.818468 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cvwjc"] Apr 23 08:50:03.820358 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:50:03.820331 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e40391b_c30d_4041_b6cf_e4b9e590a499.slice/crio-f0f6d76500ad32def9aee2c5ac68eac005ff96987788ffe8cf6a505f60932520 
WatchSource:0}: Error finding container f0f6d76500ad32def9aee2c5ac68eac005ff96987788ffe8cf6a505f60932520: Status 404 returned error can't find the container with id f0f6d76500ad32def9aee2c5ac68eac005ff96987788ffe8cf6a505f60932520 Apr 23 08:50:03.821253 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:50:03.821240 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ef9bf61_ead5_4bec_bd04_a167a6f7321f.slice/crio-56e499e9d19437293f084e93d7e940d875f2a4c5d51d5952c423bc88c90036c0 WatchSource:0}: Error finding container 56e499e9d19437293f084e93d7e940d875f2a4c5d51d5952c423bc88c90036c0: Status 404 returned error can't find the container with id 56e499e9d19437293f084e93d7e940d875f2a4c5d51d5952c423bc88c90036c0 Apr 23 08:50:04.416985 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:04.416903 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cvwjc" event={"ID":"6ef9bf61-ead5-4bec-bd04-a167a6f7321f","Type":"ContainerStarted","Data":"56e499e9d19437293f084e93d7e940d875f2a4c5d51d5952c423bc88c90036c0"} Apr 23 08:50:04.420908 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:04.420156 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wttb6" event={"ID":"7e40391b-c30d-4041-b6cf-e4b9e590a499","Type":"ContainerStarted","Data":"b922ae33fc490eea71ac8818d4cba70c5a44706b9a80d710def2468cf6a9dac2"} Apr 23 08:50:04.420908 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:04.420198 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wttb6" event={"ID":"7e40391b-c30d-4041-b6cf-e4b9e590a499","Type":"ContainerStarted","Data":"f0f6d76500ad32def9aee2c5ac68eac005ff96987788ffe8cf6a505f60932520"} Apr 23 08:50:04.422080 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:04.422056 2574 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-insights/insights-operator-585dfdc468-wsgmm" event={"ID":"976abe21-f40b-42ec-b745-de58f1628c36","Type":"ContainerStarted","Data":"82e8a60f2a90f8c0de77802509e6b3f5d1215b746e4aa0f5a4dd848a52c9aa58"} Apr 23 08:50:04.425499 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:04.425370 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g296" event={"ID":"eba944cc-6425-4433-beef-e901be182078","Type":"ContainerStarted","Data":"23fe37ae3788aade4b685acf47ecf187a17897255b83d57f81852b193dce7055"} Apr 23 08:50:04.745325 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:04.745177 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch" Apr 23 08:50:04.745505 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:04.745339 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:50:04.745505 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:04.745362 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bff4876d9-6ljch: secret "image-registry-tls" not found Apr 23 08:50:04.745505 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:04.745447 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls podName:92f47ad8-406e-4c78-b8b6-7565d305e19d nodeName:}" failed. No retries permitted until 2026-04-23 08:50:06.745427709 +0000 UTC m=+118.262287037 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls") pod "image-registry-5bff4876d9-6ljch" (UID: "92f47ad8-406e-4c78-b8b6-7565d305e19d") : secret "image-registry-tls" not found Apr 23 08:50:05.431325 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:05.431225 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9vxvt" event={"ID":"5ddb114e-9238-4d4e-9e66-f1b3f964c562","Type":"ContainerStarted","Data":"6cb7553311d239d45729fd305228fe75216bdc273cba8e59e7657cb1041b585a"} Apr 23 08:50:05.447746 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:05.447664 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wttb6" podStartSLOduration=2.447644912 podStartE2EDuration="2.447644912s" podCreationTimestamp="2026-04-23 08:50:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:50:04.440136072 +0000 UTC m=+115.956995403" watchObservedRunningTime="2026-04-23 08:50:05.447644912 +0000 UTC m=+116.964504248" Apr 23 08:50:05.448851 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:05.448808 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9vxvt" podStartSLOduration=1.6467298989999999 podStartE2EDuration="3.448795796s" podCreationTimestamp="2026-04-23 08:50:02 +0000 UTC" firstStartedPulling="2026-04-23 08:50:03.3726822 +0000 UTC m=+114.889541513" lastFinishedPulling="2026-04-23 08:50:05.174748085 +0000 UTC m=+116.691607410" observedRunningTime="2026-04-23 08:50:05.447243104 +0000 UTC m=+116.964102440" watchObservedRunningTime="2026-04-23 08:50:05.448795796 +0000 UTC m=+116.965655133" Apr 23 08:50:06.763134 ip-10-0-139-48 
kubenswrapper[2574]: I0423 08:50:06.763095 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch" Apr 23 08:50:06.763556 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:06.763296 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:50:06.763556 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:06.763316 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bff4876d9-6ljch: secret "image-registry-tls" not found Apr 23 08:50:06.763556 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:06.763373 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls podName:92f47ad8-406e-4c78-b8b6-7565d305e19d nodeName:}" failed. No retries permitted until 2026-04-23 08:50:10.763357179 +0000 UTC m=+122.280216493 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls") pod "image-registry-5bff4876d9-6ljch" (UID: "92f47ad8-406e-4c78-b8b6-7565d305e19d") : secret "image-registry-tls" not found Apr 23 08:50:07.440104 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:07.436792 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cvwjc" event={"ID":"6ef9bf61-ead5-4bec-bd04-a167a6f7321f","Type":"ContainerStarted","Data":"9c4c05fa948b133540ac81b92d64ceaa3d87d3a79d7e70b2a812d42739919165"} Apr 23 08:50:07.440104 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:07.438313 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-wsgmm" event={"ID":"976abe21-f40b-42ec-b745-de58f1628c36","Type":"ContainerStarted","Data":"9cf952471261a79943d35e725d083f472a3e43b2b9f656e4c550d183729e380e"} Apr 23 08:50:07.441993 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:07.441970 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g296" event={"ID":"eba944cc-6425-4433-beef-e901be182078","Type":"ContainerStarted","Data":"11a08b618450dc00465ed70d161bdc1662aeb1e9af6c5b3e778e4ea63313fba0"} Apr 23 08:50:07.455139 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:07.455090 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cvwjc" podStartSLOduration=1.408306861 podStartE2EDuration="4.45507701s" podCreationTimestamp="2026-04-23 08:50:03 +0000 UTC" firstStartedPulling="2026-04-23 08:50:03.823092757 +0000 UTC m=+115.339952070" lastFinishedPulling="2026-04-23 08:50:06.869862902 +0000 UTC m=+118.386722219" observedRunningTime="2026-04-23 08:50:07.454094263 +0000 UTC m=+118.970953626" watchObservedRunningTime="2026-04-23 
08:50:07.45507701 +0000 UTC m=+118.971936346" Apr 23 08:50:07.473300 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:07.473259 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-wsgmm" podStartSLOduration=1.095113178 podStartE2EDuration="4.473247417s" podCreationTimestamp="2026-04-23 08:50:03 +0000 UTC" firstStartedPulling="2026-04-23 08:50:03.486079822 +0000 UTC m=+115.002939138" lastFinishedPulling="2026-04-23 08:50:06.86421406 +0000 UTC m=+118.381073377" observedRunningTime="2026-04-23 08:50:07.472515587 +0000 UTC m=+118.989374923" watchObservedRunningTime="2026-04-23 08:50:07.473247417 +0000 UTC m=+118.990106943" Apr 23 08:50:07.492874 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:07.492829 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g296" podStartSLOduration=1.231885133 podStartE2EDuration="4.492817122s" podCreationTimestamp="2026-04-23 08:50:03 +0000 UTC" firstStartedPulling="2026-04-23 08:50:03.606626472 +0000 UTC m=+115.123485785" lastFinishedPulling="2026-04-23 08:50:06.867558444 +0000 UTC m=+118.384417774" observedRunningTime="2026-04-23 08:50:07.492130591 +0000 UTC m=+119.008989930" watchObservedRunningTime="2026-04-23 08:50:07.492817122 +0000 UTC m=+119.009676456" Apr 23 08:50:10.602006 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:10.601974 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-nvnmg"] Apr 23 08:50:10.606283 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:10.606263 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-nvnmg" Apr 23 08:50:10.609221 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:10.609197 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 23 08:50:10.610633 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:10.610612 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 23 08:50:10.610633 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:10.610625 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 23 08:50:10.610781 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:10.610622 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 23 08:50:10.610781 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:10.610675 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bh5tt\"" Apr 23 08:50:10.616627 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:10.616606 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-nvnmg"] Apr 23 08:50:10.692716 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:10.692681 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/61032a47-c259-42d4-9698-cb6a746a73f0-signing-cabundle\") pod \"service-ca-865cb79987-nvnmg\" (UID: \"61032a47-c259-42d4-9698-cb6a746a73f0\") " pod="openshift-service-ca/service-ca-865cb79987-nvnmg" Apr 23 08:50:10.692875 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:10.692761 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/61032a47-c259-42d4-9698-cb6a746a73f0-signing-key\") pod \"service-ca-865cb79987-nvnmg\" (UID: \"61032a47-c259-42d4-9698-cb6a746a73f0\") " pod="openshift-service-ca/service-ca-865cb79987-nvnmg" Apr 23 08:50:10.692875 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:10.692788 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmq44\" (UniqueName: \"kubernetes.io/projected/61032a47-c259-42d4-9698-cb6a746a73f0-kube-api-access-xmq44\") pod \"service-ca-865cb79987-nvnmg\" (UID: \"61032a47-c259-42d4-9698-cb6a746a73f0\") " pod="openshift-service-ca/service-ca-865cb79987-nvnmg" Apr 23 08:50:10.793800 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:10.793765 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/61032a47-c259-42d4-9698-cb6a746a73f0-signing-key\") pod \"service-ca-865cb79987-nvnmg\" (UID: \"61032a47-c259-42d4-9698-cb6a746a73f0\") " pod="openshift-service-ca/service-ca-865cb79987-nvnmg" Apr 23 08:50:10.793947 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:10.793805 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmq44\" (UniqueName: \"kubernetes.io/projected/61032a47-c259-42d4-9698-cb6a746a73f0-kube-api-access-xmq44\") pod \"service-ca-865cb79987-nvnmg\" (UID: \"61032a47-c259-42d4-9698-cb6a746a73f0\") " pod="openshift-service-ca/service-ca-865cb79987-nvnmg" Apr 23 08:50:10.793947 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:10.793870 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/61032a47-c259-42d4-9698-cb6a746a73f0-signing-cabundle\") pod \"service-ca-865cb79987-nvnmg\" (UID: \"61032a47-c259-42d4-9698-cb6a746a73f0\") " pod="openshift-service-ca/service-ca-865cb79987-nvnmg" Apr 23 08:50:10.793947 ip-10-0-139-48 kubenswrapper[2574]: I0423 
08:50:10.793898 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch" Apr 23 08:50:10.794075 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:10.793989 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:50:10.794075 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:10.794004 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bff4876d9-6ljch: secret "image-registry-tls" not found Apr 23 08:50:10.794075 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:10.794073 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls podName:92f47ad8-406e-4c78-b8b6-7565d305e19d nodeName:}" failed. No retries permitted until 2026-04-23 08:50:18.79405306 +0000 UTC m=+130.310912383 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls") pod "image-registry-5bff4876d9-6ljch" (UID: "92f47ad8-406e-4c78-b8b6-7565d305e19d") : secret "image-registry-tls" not found Apr 23 08:50:10.794607 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:10.794586 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/61032a47-c259-42d4-9698-cb6a746a73f0-signing-cabundle\") pod \"service-ca-865cb79987-nvnmg\" (UID: \"61032a47-c259-42d4-9698-cb6a746a73f0\") " pod="openshift-service-ca/service-ca-865cb79987-nvnmg" Apr 23 08:50:10.796318 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:10.796297 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/61032a47-c259-42d4-9698-cb6a746a73f0-signing-key\") pod \"service-ca-865cb79987-nvnmg\" (UID: \"61032a47-c259-42d4-9698-cb6a746a73f0\") " pod="openshift-service-ca/service-ca-865cb79987-nvnmg" Apr 23 08:50:10.806760 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:10.806729 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmq44\" (UniqueName: \"kubernetes.io/projected/61032a47-c259-42d4-9698-cb6a746a73f0-kube-api-access-xmq44\") pod \"service-ca-865cb79987-nvnmg\" (UID: \"61032a47-c259-42d4-9698-cb6a746a73f0\") " pod="openshift-service-ca/service-ca-865cb79987-nvnmg" Apr 23 08:50:10.903562 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:10.903480 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ffdbq_e556abe8-644b-4251-99a6-a109bfc8c173/dns-node-resolver/0.log" Apr 23 08:50:10.914780 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:10.914756 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-nvnmg" Apr 23 08:50:11.032826 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:11.032793 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-nvnmg"] Apr 23 08:50:11.036174 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:50:11.036144 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61032a47_c259_42d4_9698_cb6a746a73f0.slice/crio-3d3de51e280f8449c81938b38d81a6827928fa0cfee0a9a5af79d8fa65a0ea8f WatchSource:0}: Error finding container 3d3de51e280f8449c81938b38d81a6827928fa0cfee0a9a5af79d8fa65a0ea8f: Status 404 returned error can't find the container with id 3d3de51e280f8449c81938b38d81a6827928fa0cfee0a9a5af79d8fa65a0ea8f Apr 23 08:50:11.454818 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:11.454785 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-nvnmg" event={"ID":"61032a47-c259-42d4-9698-cb6a746a73f0","Type":"ContainerStarted","Data":"098cafae16e0b08b2eb84d0fee128993e4f359389ea70d56a3fa8b8dadd8b4a4"} Apr 23 08:50:11.454818 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:11.454818 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-nvnmg" event={"ID":"61032a47-c259-42d4-9698-cb6a746a73f0","Type":"ContainerStarted","Data":"3d3de51e280f8449c81938b38d81a6827928fa0cfee0a9a5af79d8fa65a0ea8f"} Apr 23 08:50:11.473940 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:11.473899 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-nvnmg" podStartSLOduration=1.473885706 podStartE2EDuration="1.473885706s" podCreationTimestamp="2026-04-23 08:50:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:50:11.473140973 
+0000 UTC m=+122.990000311" watchObservedRunningTime="2026-04-23 08:50:11.473885706 +0000 UTC m=+122.990745041" Apr 23 08:50:11.703137 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:11.703110 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-px7gw_3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0/node-ca/0.log" Apr 23 08:50:13.305506 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:13.305480 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-4g296_eba944cc-6425-4433-beef-e901be182078/kube-storage-version-migrator-operator/0.log" Apr 23 08:50:18.759358 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:18.759316 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs\") pod \"network-metrics-daemon-wnmff\" (UID: \"b87e8c66-90ff-454c-9c82-3fe28797e8df\") " pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:50:18.759960 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:18.759473 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 08:50:18.759960 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:18.759550 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs podName:b87e8c66-90ff-454c-9c82-3fe28797e8df nodeName:}" failed. No retries permitted until 2026-04-23 08:52:20.759532959 +0000 UTC m=+252.276392271 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs") pod "network-metrics-daemon-wnmff" (UID: "b87e8c66-90ff-454c-9c82-3fe28797e8df") : secret "metrics-daemon-secret" not found Apr 23 08:50:18.860343 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:18.860309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch" Apr 23 08:50:18.862798 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:18.862771 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls\") pod \"image-registry-5bff4876d9-6ljch\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " pod="openshift-image-registry/image-registry-5bff4876d9-6ljch" Apr 23 08:50:18.976014 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:18.975990 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tpxkg\"" Apr 23 08:50:18.983204 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:18.983186 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5bff4876d9-6ljch" Apr 23 08:50:19.110572 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:19.110542 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5bff4876d9-6ljch"] Apr 23 08:50:19.113958 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:50:19.113929 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92f47ad8_406e_4c78_b8b6_7565d305e19d.slice/crio-082b348038db114e5d61c9ae1e9da2c4eb2444bfd159eab49552411521f6411a WatchSource:0}: Error finding container 082b348038db114e5d61c9ae1e9da2c4eb2444bfd159eab49552411521f6411a: Status 404 returned error can't find the container with id 082b348038db114e5d61c9ae1e9da2c4eb2444bfd159eab49552411521f6411a Apr 23 08:50:19.475612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:19.475564 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5bff4876d9-6ljch" event={"ID":"92f47ad8-406e-4c78-b8b6-7565d305e19d","Type":"ContainerStarted","Data":"c0ee9bf6009c535d22ff3c9c41c21dfc234b4aa8dc108309a9b0023ca6865423"} Apr 23 08:50:19.475612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:19.475615 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5bff4876d9-6ljch" event={"ID":"92f47ad8-406e-4c78-b8b6-7565d305e19d","Type":"ContainerStarted","Data":"082b348038db114e5d61c9ae1e9da2c4eb2444bfd159eab49552411521f6411a"} Apr 23 08:50:19.475845 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:19.475649 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5bff4876d9-6ljch" Apr 23 08:50:19.496246 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:19.496201 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5bff4876d9-6ljch" 
podStartSLOduration=16.496187907 podStartE2EDuration="16.496187907s" podCreationTimestamp="2026-04-23 08:50:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:50:19.495323008 +0000 UTC m=+131.012182342" watchObservedRunningTime="2026-04-23 08:50:19.496187907 +0000 UTC m=+131.013047270" Apr 23 08:50:33.902999 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.902961 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-g996f"] Apr 23 08:50:33.907564 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.907544 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-g996f" Apr 23 08:50:33.909169 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.909151 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5bff4876d9-6ljch"] Apr 23 08:50:33.910264 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.910248 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-wnmgb\"" Apr 23 08:50:33.910560 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.910541 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 08:50:33.910724 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.910711 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 08:50:33.918517 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.918490 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-g996f"] Apr 23 08:50:33.981323 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.981293 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-network-console/networking-console-plugin-cb95c66f6-pzrvb"] Apr 23 08:50:33.984653 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.984635 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-9949b69cb-2t8pg"] Apr 23 08:50:33.984798 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.984778 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-pzrvb" Apr 23 08:50:33.987862 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.987845 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:33.988448 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.988431 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-x54vs\"" Apr 23 08:50:33.988768 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.988748 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 23 08:50:33.988866 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.988822 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 23 08:50:33.993160 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.993146 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 08:50:33.993372 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.993355 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 08:50:33.993661 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.993640 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 
08:50:33.993814 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.993663 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 08:50:33.993814 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.993668 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-9x486\"" Apr 23 08:50:33.993814 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.993668 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 08:50:33.994050 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:33.994034 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 08:50:34.000188 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.000169 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 08:50:34.000277 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.000256 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9949b69cb-2t8pg"] Apr 23 08:50:34.049269 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.049243 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-pzrvb"] Apr 23 08:50:34.068612 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.068588 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b6s4\" (UniqueName: \"kubernetes.io/projected/d223acb1-cce5-46a5-9460-64ee56b66cf1-kube-api-access-7b6s4\") pod \"insights-runtime-extractor-g996f\" (UID: \"d223acb1-cce5-46a5-9460-64ee56b66cf1\") " pod="openshift-insights/insights-runtime-extractor-g996f" Apr 23 08:50:34.068711 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.068616 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d223acb1-cce5-46a5-9460-64ee56b66cf1-data-volume\") pod \"insights-runtime-extractor-g996f\" (UID: \"d223acb1-cce5-46a5-9460-64ee56b66cf1\") " pod="openshift-insights/insights-runtime-extractor-g996f" Apr 23 08:50:34.068711 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.068646 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/601b91da-cc29-4ccb-a0f3-c184e435eef8-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-pzrvb\" (UID: \"601b91da-cc29-4ccb-a0f3-c184e435eef8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pzrvb" Apr 23 08:50:34.068711 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.068696 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/601b91da-cc29-4ccb-a0f3-c184e435eef8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pzrvb\" (UID: \"601b91da-cc29-4ccb-a0f3-c184e435eef8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pzrvb" Apr 23 08:50:34.068811 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.068729 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d223acb1-cce5-46a5-9460-64ee56b66cf1-crio-socket\") pod \"insights-runtime-extractor-g996f\" (UID: \"d223acb1-cce5-46a5-9460-64ee56b66cf1\") " pod="openshift-insights/insights-runtime-extractor-g996f" Apr 23 08:50:34.068811 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.068750 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/d223acb1-cce5-46a5-9460-64ee56b66cf1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-g996f\" (UID: \"d223acb1-cce5-46a5-9460-64ee56b66cf1\") " pod="openshift-insights/insights-runtime-extractor-g996f" Apr 23 08:50:34.068811 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.068780 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d223acb1-cce5-46a5-9460-64ee56b66cf1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-g996f\" (UID: \"d223acb1-cce5-46a5-9460-64ee56b66cf1\") " pod="openshift-insights/insights-runtime-extractor-g996f" Apr 23 08:50:34.169966 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.169902 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9fbfde04-c427-426f-95d3-93915990fa2a-service-ca\") pod \"console-9949b69cb-2t8pg\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:34.169966 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.169952 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7b6s4\" (UniqueName: \"kubernetes.io/projected/d223acb1-cce5-46a5-9460-64ee56b66cf1-kube-api-access-7b6s4\") pod \"insights-runtime-extractor-g996f\" (UID: \"d223acb1-cce5-46a5-9460-64ee56b66cf1\") " pod="openshift-insights/insights-runtime-extractor-g996f" Apr 23 08:50:34.170110 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.169997 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d223acb1-cce5-46a5-9460-64ee56b66cf1-data-volume\") pod \"insights-runtime-extractor-g996f\" (UID: \"d223acb1-cce5-46a5-9460-64ee56b66cf1\") " pod="openshift-insights/insights-runtime-extractor-g996f" Apr 23 
08:50:34.170110 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.170042 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhjq2\" (UniqueName: \"kubernetes.io/projected/9fbfde04-c427-426f-95d3-93915990fa2a-kube-api-access-hhjq2\") pod \"console-9949b69cb-2t8pg\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:34.170110 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.170065 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/601b91da-cc29-4ccb-a0f3-c184e435eef8-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-pzrvb\" (UID: \"601b91da-cc29-4ccb-a0f3-c184e435eef8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pzrvb" Apr 23 08:50:34.170110 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.170087 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/601b91da-cc29-4ccb-a0f3-c184e435eef8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pzrvb\" (UID: \"601b91da-cc29-4ccb-a0f3-c184e435eef8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pzrvb" Apr 23 08:50:34.170110 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.170108 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d223acb1-cce5-46a5-9460-64ee56b66cf1-crio-socket\") pod \"insights-runtime-extractor-g996f\" (UID: \"d223acb1-cce5-46a5-9460-64ee56b66cf1\") " pod="openshift-insights/insights-runtime-extractor-g996f" Apr 23 08:50:34.170286 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.170123 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/d223acb1-cce5-46a5-9460-64ee56b66cf1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-g996f\" (UID: \"d223acb1-cce5-46a5-9460-64ee56b66cf1\") " pod="openshift-insights/insights-runtime-extractor-g996f" Apr 23 08:50:34.170286 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.170141 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9fbfde04-c427-426f-95d3-93915990fa2a-console-serving-cert\") pod \"console-9949b69cb-2t8pg\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:34.170286 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.170174 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9fbfde04-c427-426f-95d3-93915990fa2a-console-oauth-config\") pod \"console-9949b69cb-2t8pg\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:34.170286 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.170196 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9fbfde04-c427-426f-95d3-93915990fa2a-console-config\") pod \"console-9949b69cb-2t8pg\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:34.170286 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.170217 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d223acb1-cce5-46a5-9460-64ee56b66cf1-crio-socket\") pod \"insights-runtime-extractor-g996f\" (UID: \"d223acb1-cce5-46a5-9460-64ee56b66cf1\") " pod="openshift-insights/insights-runtime-extractor-g996f" Apr 23 08:50:34.170286 ip-10-0-139-48 kubenswrapper[2574]: I0423 
08:50:34.170244 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d223acb1-cce5-46a5-9460-64ee56b66cf1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-g996f\" (UID: \"d223acb1-cce5-46a5-9460-64ee56b66cf1\") " pod="openshift-insights/insights-runtime-extractor-g996f" Apr 23 08:50:34.170523 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.170287 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9fbfde04-c427-426f-95d3-93915990fa2a-oauth-serving-cert\") pod \"console-9949b69cb-2t8pg\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:34.170523 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.170364 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d223acb1-cce5-46a5-9460-64ee56b66cf1-data-volume\") pod \"insights-runtime-extractor-g996f\" (UID: \"d223acb1-cce5-46a5-9460-64ee56b66cf1\") " pod="openshift-insights/insights-runtime-extractor-g996f" Apr 23 08:50:34.170762 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.170730 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d223acb1-cce5-46a5-9460-64ee56b66cf1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-g996f\" (UID: \"d223acb1-cce5-46a5-9460-64ee56b66cf1\") " pod="openshift-insights/insights-runtime-extractor-g996f" Apr 23 08:50:34.170863 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.170811 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/601b91da-cc29-4ccb-a0f3-c184e435eef8-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-pzrvb\" (UID: 
\"601b91da-cc29-4ccb-a0f3-c184e435eef8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pzrvb" Apr 23 08:50:34.172877 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.172848 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/601b91da-cc29-4ccb-a0f3-c184e435eef8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pzrvb\" (UID: \"601b91da-cc29-4ccb-a0f3-c184e435eef8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pzrvb" Apr 23 08:50:34.173010 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.172994 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d223acb1-cce5-46a5-9460-64ee56b66cf1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-g996f\" (UID: \"d223acb1-cce5-46a5-9460-64ee56b66cf1\") " pod="openshift-insights/insights-runtime-extractor-g996f" Apr 23 08:50:34.180116 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.180096 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b6s4\" (UniqueName: \"kubernetes.io/projected/d223acb1-cce5-46a5-9460-64ee56b66cf1-kube-api-access-7b6s4\") pod \"insights-runtime-extractor-g996f\" (UID: \"d223acb1-cce5-46a5-9460-64ee56b66cf1\") " pod="openshift-insights/insights-runtime-extractor-g996f" Apr 23 08:50:34.219856 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.219839 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-g996f" Apr 23 08:50:34.271081 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.271053 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9fbfde04-c427-426f-95d3-93915990fa2a-console-serving-cert\") pod \"console-9949b69cb-2t8pg\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:34.271212 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.271094 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9fbfde04-c427-426f-95d3-93915990fa2a-console-oauth-config\") pod \"console-9949b69cb-2t8pg\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:34.271212 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.271117 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9fbfde04-c427-426f-95d3-93915990fa2a-console-config\") pod \"console-9949b69cb-2t8pg\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:34.271212 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.271150 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9fbfde04-c427-426f-95d3-93915990fa2a-oauth-serving-cert\") pod \"console-9949b69cb-2t8pg\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:34.271212 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.271201 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9fbfde04-c427-426f-95d3-93915990fa2a-service-ca\") pod 
\"console-9949b69cb-2t8pg\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:34.271465 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.271265 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hhjq2\" (UniqueName: \"kubernetes.io/projected/9fbfde04-c427-426f-95d3-93915990fa2a-kube-api-access-hhjq2\") pod \"console-9949b69cb-2t8pg\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:34.271865 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.271840 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9fbfde04-c427-426f-95d3-93915990fa2a-console-config\") pod \"console-9949b69cb-2t8pg\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:34.272001 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.271974 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9fbfde04-c427-426f-95d3-93915990fa2a-oauth-serving-cert\") pod \"console-9949b69cb-2t8pg\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:34.272257 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.272216 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9fbfde04-c427-426f-95d3-93915990fa2a-service-ca\") pod \"console-9949b69cb-2t8pg\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:34.274029 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.274007 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9fbfde04-c427-426f-95d3-93915990fa2a-console-serving-cert\") pod \"console-9949b69cb-2t8pg\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:34.274856 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.274833 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9fbfde04-c427-426f-95d3-93915990fa2a-console-oauth-config\") pod \"console-9949b69cb-2t8pg\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:34.279587 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.279543 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhjq2\" (UniqueName: \"kubernetes.io/projected/9fbfde04-c427-426f-95d3-93915990fa2a-kube-api-access-hhjq2\") pod \"console-9949b69cb-2t8pg\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:34.294668 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.294646 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-pzrvb" Apr 23 08:50:34.300819 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.300458 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:34.341127 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.340968 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-g996f"] Apr 23 08:50:34.344433 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:50:34.344383 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd223acb1_cce5_46a5_9460_64ee56b66cf1.slice/crio-b0ce8620a0243e4c66fb7dc1743400a0f2a1ed84f6cdf0c3677e485cabb05e76 WatchSource:0}: Error finding container b0ce8620a0243e4c66fb7dc1743400a0f2a1ed84f6cdf0c3677e485cabb05e76: Status 404 returned error can't find the container with id b0ce8620a0243e4c66fb7dc1743400a0f2a1ed84f6cdf0c3677e485cabb05e76 Apr 23 08:50:34.432911 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.432887 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-pzrvb"] Apr 23 08:50:34.436096 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:50:34.436072 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod601b91da_cc29_4ccb_a0f3_c184e435eef8.slice/crio-5941c600d82fa40578764ea348c2c74d999a1a6e23eb2570a4889d9cda9168c8 WatchSource:0}: Error finding container 5941c600d82fa40578764ea348c2c74d999a1a6e23eb2570a4889d9cda9168c8: Status 404 returned error can't find the container with id 5941c600d82fa40578764ea348c2c74d999a1a6e23eb2570a4889d9cda9168c8 Apr 23 08:50:34.453478 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.453458 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9949b69cb-2t8pg"] Apr 23 08:50:34.456238 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:50:34.456201 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fbfde04_c427_426f_95d3_93915990fa2a.slice/crio-e2bf01d201db10b791d691c1232106c9bc199ae211f6782c539d0c62f93e790e WatchSource:0}: Error finding container e2bf01d201db10b791d691c1232106c9bc199ae211f6782c539d0c62f93e790e: Status 404 returned error can't find the container with id e2bf01d201db10b791d691c1232106c9bc199ae211f6782c539d0c62f93e790e Apr 23 08:50:34.511249 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.511217 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9949b69cb-2t8pg" event={"ID":"9fbfde04-c427-426f-95d3-93915990fa2a","Type":"ContainerStarted","Data":"e2bf01d201db10b791d691c1232106c9bc199ae211f6782c539d0c62f93e790e"} Apr 23 08:50:34.512148 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.512127 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-pzrvb" event={"ID":"601b91da-cc29-4ccb-a0f3-c184e435eef8","Type":"ContainerStarted","Data":"5941c600d82fa40578764ea348c2c74d999a1a6e23eb2570a4889d9cda9168c8"} Apr 23 08:50:34.513228 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.513206 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-g996f" event={"ID":"d223acb1-cce5-46a5-9460-64ee56b66cf1","Type":"ContainerStarted","Data":"7fbcf9f91a5c9c585e657232ae7a66bf98357395b3091e8408a63308cabafad9"} Apr 23 08:50:34.513324 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:34.513235 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-g996f" event={"ID":"d223acb1-cce5-46a5-9460-64ee56b66cf1","Type":"ContainerStarted","Data":"b0ce8620a0243e4c66fb7dc1743400a0f2a1ed84f6cdf0c3677e485cabb05e76"} Apr 23 08:50:35.517969 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:35.517881 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-g996f" 
event={"ID":"d223acb1-cce5-46a5-9460-64ee56b66cf1","Type":"ContainerStarted","Data":"398ac258f2b157ef4821d77d0feecc6a3db8dad8fd6cd642b86da8ae04e1ee94"} Apr 23 08:50:36.523436 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:36.523366 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-pzrvb" event={"ID":"601b91da-cc29-4ccb-a0f3-c184e435eef8","Type":"ContainerStarted","Data":"e8e6109375a36b9446ce14703d8c838e6c07a50f0e61b6daef6589d5a63e03e1"} Apr 23 08:50:36.541904 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:36.541847 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-pzrvb" podStartSLOduration=2.355975147 podStartE2EDuration="3.541829623s" podCreationTimestamp="2026-04-23 08:50:33 +0000 UTC" firstStartedPulling="2026-04-23 08:50:34.437716119 +0000 UTC m=+145.954575432" lastFinishedPulling="2026-04-23 08:50:35.623570578 +0000 UTC m=+147.140429908" observedRunningTime="2026-04-23 08:50:36.541056016 +0000 UTC m=+148.057915352" watchObservedRunningTime="2026-04-23 08:50:36.541829623 +0000 UTC m=+148.058688959" Apr 23 08:50:38.053025 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:38.052997 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r5k8h"] Apr 23 08:50:38.055985 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:38.055969 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r5k8h" Apr 23 08:50:38.058788 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:38.058768 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 23 08:50:38.058869 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:38.058780 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-wrlxd\"" Apr 23 08:50:38.064210 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:38.064185 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r5k8h"] Apr 23 08:50:38.207676 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:38.207587 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/55fef949-2a3f-4c26-aa87-1598dd40e0f4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-r5k8h\" (UID: \"55fef949-2a3f-4c26-aa87-1598dd40e0f4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r5k8h" Apr 23 08:50:38.308930 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:38.308901 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/55fef949-2a3f-4c26-aa87-1598dd40e0f4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-r5k8h\" (UID: \"55fef949-2a3f-4c26-aa87-1598dd40e0f4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r5k8h" Apr 23 08:50:38.311513 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:38.311497 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/55fef949-2a3f-4c26-aa87-1598dd40e0f4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-r5k8h\" (UID: \"55fef949-2a3f-4c26-aa87-1598dd40e0f4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r5k8h" Apr 23 08:50:38.365498 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:38.365466 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r5k8h" Apr 23 08:50:38.480811 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:38.480654 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r5k8h"] Apr 23 08:50:38.483099 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:50:38.483075 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55fef949_2a3f_4c26_aa87_1598dd40e0f4.slice/crio-6cdbb60d1e5dd180e3f03e00a7200fbb1cb75e24d7d13728ec168cd71ae7b130 WatchSource:0}: Error finding container 6cdbb60d1e5dd180e3f03e00a7200fbb1cb75e24d7d13728ec168cd71ae7b130: Status 404 returned error can't find the container with id 6cdbb60d1e5dd180e3f03e00a7200fbb1cb75e24d7d13728ec168cd71ae7b130 Apr 23 08:50:38.529615 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:38.529586 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9949b69cb-2t8pg" event={"ID":"9fbfde04-c427-426f-95d3-93915990fa2a","Type":"ContainerStarted","Data":"0d8d0935792f8116d1644ccee4a29c7e2268f107efee6f49445c95755d32b26b"} Apr 23 08:50:38.530595 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:38.530575 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r5k8h" event={"ID":"55fef949-2a3f-4c26-aa87-1598dd40e0f4","Type":"ContainerStarted","Data":"6cdbb60d1e5dd180e3f03e00a7200fbb1cb75e24d7d13728ec168cd71ae7b130"} Apr 
23 08:50:38.532172 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:38.532151 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-g996f" event={"ID":"d223acb1-cce5-46a5-9460-64ee56b66cf1","Type":"ContainerStarted","Data":"76c26860eaceaab655337c66cc9807d817657f4defc649cde0bff0ad41c95e29"} Apr 23 08:50:38.550840 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:38.550799 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9949b69cb-2t8pg" podStartSLOduration=2.177733783 podStartE2EDuration="5.550788362s" podCreationTimestamp="2026-04-23 08:50:33 +0000 UTC" firstStartedPulling="2026-04-23 08:50:34.458030652 +0000 UTC m=+145.974889968" lastFinishedPulling="2026-04-23 08:50:37.831085233 +0000 UTC m=+149.347944547" observedRunningTime="2026-04-23 08:50:38.549432999 +0000 UTC m=+150.066292393" watchObservedRunningTime="2026-04-23 08:50:38.550788362 +0000 UTC m=+150.067647696" Apr 23 08:50:38.567088 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:38.567040 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-g996f" podStartSLOduration=2.159533638 podStartE2EDuration="5.567031851s" podCreationTimestamp="2026-04-23 08:50:33 +0000 UTC" firstStartedPulling="2026-04-23 08:50:34.418447573 +0000 UTC m=+145.935306894" lastFinishedPulling="2026-04-23 08:50:37.825945795 +0000 UTC m=+149.342805107" observedRunningTime="2026-04-23 08:50:38.56631549 +0000 UTC m=+150.083174827" watchObservedRunningTime="2026-04-23 08:50:38.567031851 +0000 UTC m=+150.083891185" Apr 23 08:50:40.538215 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:40.538171 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r5k8h" 
event={"ID":"55fef949-2a3f-4c26-aa87-1598dd40e0f4","Type":"ContainerStarted","Data":"25053d8c9cc9041499edb0e1c5f5ad7daae558ed40080a4b8e7323b4db9c2c4c"} Apr 23 08:50:40.538584 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:40.538381 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r5k8h" Apr 23 08:50:40.542667 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:40.542647 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r5k8h" Apr 23 08:50:40.556062 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:40.556026 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-r5k8h" podStartSLOduration=1.45494544 podStartE2EDuration="2.556017244s" podCreationTimestamp="2026-04-23 08:50:38 +0000 UTC" firstStartedPulling="2026-04-23 08:50:38.486653347 +0000 UTC m=+150.003512660" lastFinishedPulling="2026-04-23 08:50:39.587725148 +0000 UTC m=+151.104584464" observedRunningTime="2026-04-23 08:50:40.555213798 +0000 UTC m=+152.072073132" watchObservedRunningTime="2026-04-23 08:50:40.556017244 +0000 UTC m=+152.072876580" Apr 23 08:50:42.216848 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.216818 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56bd7886c-mnnsv"] Apr 23 08:50:42.220039 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.220023 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.229687 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.229663 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 08:50:42.230307 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.230287 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56bd7886c-mnnsv"] Apr 23 08:50:42.237285 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.237263 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck5v8\" (UniqueName: \"kubernetes.io/projected/d2cb1832-d60b-4124-bae2-26882e380037-kube-api-access-ck5v8\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.237413 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.237311 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2cb1832-d60b-4124-bae2-26882e380037-console-serving-cert\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.237413 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.237340 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-service-ca\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.237413 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.237364 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-trusted-ca-bundle\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.237413 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.237410 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-console-config\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.237597 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.237437 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-oauth-serving-cert\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.237597 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.237463 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2cb1832-d60b-4124-bae2-26882e380037-console-oauth-config\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.338333 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.338299 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-oauth-serving-cert\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.338333 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.338338 
2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2cb1832-d60b-4124-bae2-26882e380037-console-oauth-config\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.338523 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.338502 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ck5v8\" (UniqueName: \"kubernetes.io/projected/d2cb1832-d60b-4124-bae2-26882e380037-kube-api-access-ck5v8\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.338561 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.338543 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2cb1832-d60b-4124-bae2-26882e380037-console-serving-cert\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.338594 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.338561 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-service-ca\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.338713 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.338694 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-trusted-ca-bundle\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.338792 
ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.338744 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-console-config\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.339128 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.339104 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-oauth-serving-cert\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.339219 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.339177 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-service-ca\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.340002 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.339978 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-trusted-ca-bundle\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.340666 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.340646 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-console-config\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.341099 
ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.341071 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2cb1832-d60b-4124-bae2-26882e380037-console-oauth-config\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.341173 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.341109 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2cb1832-d60b-4124-bae2-26882e380037-console-serving-cert\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.347259 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.347242 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck5v8\" (UniqueName: \"kubernetes.io/projected/d2cb1832-d60b-4124-bae2-26882e380037-kube-api-access-ck5v8\") pod \"console-56bd7886c-mnnsv\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.529661 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.529571 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:50:42.652123 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:42.652092 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56bd7886c-mnnsv"] Apr 23 08:50:42.655890 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:50:42.655845 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2cb1832_d60b_4124_bae2_26882e380037.slice/crio-efa1de92e692605e5418b1f332d9e253ae95feb7f738ec578320d802885167c7 WatchSource:0}: Error finding container efa1de92e692605e5418b1f332d9e253ae95feb7f738ec578320d802885167c7: Status 404 returned error can't find the container with id efa1de92e692605e5418b1f332d9e253ae95feb7f738ec578320d802885167c7 Apr 23 08:50:43.552422 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:43.552371 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56bd7886c-mnnsv" event={"ID":"d2cb1832-d60b-4124-bae2-26882e380037","Type":"ContainerStarted","Data":"c699c64f409b53885198105f9c1ee0af248ff3ef5cf14924c5390c7089131180"} Apr 23 08:50:43.552422 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:43.552421 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56bd7886c-mnnsv" event={"ID":"d2cb1832-d60b-4124-bae2-26882e380037","Type":"ContainerStarted","Data":"efa1de92e692605e5418b1f332d9e253ae95feb7f738ec578320d802885167c7"} Apr 23 08:50:43.572074 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:43.572035 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56bd7886c-mnnsv" podStartSLOduration=1.572022017 podStartE2EDuration="1.572022017s" podCreationTimestamp="2026-04-23 08:50:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:50:43.570164213 +0000 UTC m=+155.087023586" 
watchObservedRunningTime="2026-04-23 08:50:43.572022017 +0000 UTC m=+155.088881329" Apr 23 08:50:43.914230 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:43.914198 2574 patch_prober.go:28] interesting pod/image-registry-5bff4876d9-6ljch container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 08:50:43.914372 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:43.914249 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5bff4876d9-6ljch" podUID="92f47ad8-406e-4c78-b8b6-7565d305e19d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 08:50:44.300936 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:44.300861 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:44.300936 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:44.300904 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:50:44.302143 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:44.302117 2574 patch_prober.go:28] interesting pod/console-9949b69cb-2t8pg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.16:8443/health\": dial tcp 10.134.0.16:8443: connect: connection refused" start-of-body= Apr 23 08:50:44.302246 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:44.302163 2574 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-9949b69cb-2t8pg" podUID="9fbfde04-c427-426f-95d3-93915990fa2a" containerName="console" probeResult="failure" output="Get \"https://10.134.0.16:8443/health\": dial tcp 10.134.0.16:8443: connect: connection refused" Apr 23 
08:50:45.912075 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:45.911981 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-fxlql" podUID="67ac0c42-3257-496f-9fbd-98d7b980d7d5" Apr 23 08:50:45.923140 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:45.923113 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-pnnnr" podUID="85967c1f-fcc5-477d-949f-62d16a82fb18" Apr 23 08:50:46.537362 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.537327 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w"] Apr 23 08:50:46.541208 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.541188 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w" Apr 23 08:50:46.542092 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.542064 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jjhq5"] Apr 23 08:50:46.545343 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.545324 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:46.546958 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.546730 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 08:50:46.546958 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.546754 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 08:50:46.546958 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.546777 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 08:50:46.548113 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.548095 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 23 08:50:46.548546 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.548530 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 23 08:50:46.548740 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.548665 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 23 08:50:46.548924 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.548861 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-tzt2f\"" Apr 23 08:50:46.549114 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.549099 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 23 08:50:46.549692 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.549675 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 23 08:50:46.550254 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.550051 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-r8sp7\"" Apr 23 08:50:46.557784 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.557766 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w"] Apr 23 08:50:46.561823 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.561634 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fxlql" Apr 23 08:50:46.568020 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.567997 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jjhq5"] Apr 23 08:50:46.571192 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.571170 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rkkg\" (UniqueName: \"kubernetes.io/projected/be5352cb-7773-4886-82c2-b96726cfae23-kube-api-access-8rkkg\") pod \"kube-state-metrics-69db897b98-jjhq5\" (UID: \"be5352cb-7773-4886-82c2-b96726cfae23\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:46.571287 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.571207 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c2661932-d940-478a-a53c-e34fb0fafcf8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-dzd2w\" (UID: \"c2661932-d940-478a-a53c-e34fb0fafcf8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w" Apr 23 08:50:46.571287 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.571253 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/be5352cb-7773-4886-82c2-b96726cfae23-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jjhq5\" (UID: \"be5352cb-7773-4886-82c2-b96726cfae23\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:46.571417 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.571299 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5352cb-7773-4886-82c2-b96726cfae23-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jjhq5\" (UID: \"be5352cb-7773-4886-82c2-b96726cfae23\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:46.571417 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.571334 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/be5352cb-7773-4886-82c2-b96726cfae23-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jjhq5\" (UID: \"be5352cb-7773-4886-82c2-b96726cfae23\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:46.571417 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.571370 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/be5352cb-7773-4886-82c2-b96726cfae23-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jjhq5\" (UID: \"be5352cb-7773-4886-82c2-b96726cfae23\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:46.571616 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.571499 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2661932-d940-478a-a53c-e34fb0fafcf8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-dzd2w\" (UID: \"c2661932-d940-478a-a53c-e34fb0fafcf8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w" Apr 23 08:50:46.571616 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.571551 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c2661932-d940-478a-a53c-e34fb0fafcf8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-dzd2w\" (UID: \"c2661932-d940-478a-a53c-e34fb0fafcf8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w" Apr 23 08:50:46.571616 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.571584 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/be5352cb-7773-4886-82c2-b96726cfae23-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jjhq5\" (UID: \"be5352cb-7773-4886-82c2-b96726cfae23\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:46.571754 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.571638 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcpdg\" (UniqueName: \"kubernetes.io/projected/c2661932-d940-478a-a53c-e34fb0fafcf8-kube-api-access-gcpdg\") pod \"openshift-state-metrics-9d44df66c-dzd2w\" (UID: \"c2661932-d940-478a-a53c-e34fb0fafcf8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w" Apr 23 08:50:46.591862 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.591840 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/node-exporter-htrdc"] Apr 23 08:50:46.595139 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.595123 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.598350 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.598335 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 08:50:46.599586 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.599568 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 08:50:46.602472 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.600801 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-bp5x8\"" Apr 23 08:50:46.602472 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.601165 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 08:50:46.672605 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.672575 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/32bc556f-c293-4692-aaf0-8ddd74da2d7e-metrics-client-ca\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.673135 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.672624 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/be5352cb-7773-4886-82c2-b96726cfae23-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jjhq5\" (UID: 
\"be5352cb-7773-4886-82c2-b96726cfae23\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:46.673213 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.673174 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5352cb-7773-4886-82c2-b96726cfae23-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jjhq5\" (UID: \"be5352cb-7773-4886-82c2-b96726cfae23\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:46.673213 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.673205 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/be5352cb-7773-4886-82c2-b96726cfae23-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jjhq5\" (UID: \"be5352cb-7773-4886-82c2-b96726cfae23\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:46.673308 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.673237 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/32bc556f-c293-4692-aaf0-8ddd74da2d7e-sys\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.673308 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.673275 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/be5352cb-7773-4886-82c2-b96726cfae23-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jjhq5\" (UID: \"be5352cb-7773-4886-82c2-b96726cfae23\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:46.673308 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:46.673292 2574 secret.go:189] Couldn't get 
secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 23 08:50:46.673470 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:46.673371 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be5352cb-7773-4886-82c2-b96726cfae23-kube-state-metrics-tls podName:be5352cb-7773-4886-82c2-b96726cfae23 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:47.173351712 +0000 UTC m=+158.690211027 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/be5352cb-7773-4886-82c2-b96726cfae23-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-jjhq5" (UID: "be5352cb-7773-4886-82c2-b96726cfae23") : secret "kube-state-metrics-tls" not found Apr 23 08:50:46.673470 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.673300 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/32bc556f-c293-4692-aaf0-8ddd74da2d7e-node-exporter-tls\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.673470 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.673447 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/32bc556f-c293-4692-aaf0-8ddd74da2d7e-node-exporter-accelerators-collector-config\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.673618 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.673481 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/32bc556f-c293-4692-aaf0-8ddd74da2d7e-node-exporter-wtmp\") pod 
\"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.673618 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.673551 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2661932-d940-478a-a53c-e34fb0fafcf8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-dzd2w\" (UID: \"c2661932-d940-478a-a53c-e34fb0fafcf8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w" Apr 23 08:50:46.673618 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.673595 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c2661932-d940-478a-a53c-e34fb0fafcf8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-dzd2w\" (UID: \"c2661932-d940-478a-a53c-e34fb0fafcf8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w" Apr 23 08:50:46.673618 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.673596 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/be5352cb-7773-4886-82c2-b96726cfae23-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jjhq5\" (UID: \"be5352cb-7773-4886-82c2-b96726cfae23\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:46.673930 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.673909 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/be5352cb-7773-4886-82c2-b96726cfae23-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jjhq5\" (UID: \"be5352cb-7773-4886-82c2-b96726cfae23\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:46.674085 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.674067 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/be5352cb-7773-4886-82c2-b96726cfae23-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jjhq5\" (UID: \"be5352cb-7773-4886-82c2-b96726cfae23\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:46.674714 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.674204 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/32bc556f-c293-4692-aaf0-8ddd74da2d7e-root\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.674714 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.674246 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gcpdg\" (UniqueName: \"kubernetes.io/projected/c2661932-d940-478a-a53c-e34fb0fafcf8-kube-api-access-gcpdg\") pod \"openshift-state-metrics-9d44df66c-dzd2w\" (UID: \"c2661932-d940-478a-a53c-e34fb0fafcf8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w" Apr 23 08:50:46.674714 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.674276 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxs5j\" (UniqueName: \"kubernetes.io/projected/32bc556f-c293-4692-aaf0-8ddd74da2d7e-kube-api-access-fxs5j\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.674714 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:46.674317 2574 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not 
found Apr 23 08:50:46.674714 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.674332 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rkkg\" (UniqueName: \"kubernetes.io/projected/be5352cb-7773-4886-82c2-b96726cfae23-kube-api-access-8rkkg\") pod \"kube-state-metrics-69db897b98-jjhq5\" (UID: \"be5352cb-7773-4886-82c2-b96726cfae23\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:46.674714 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.674359 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c2661932-d940-478a-a53c-e34fb0fafcf8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-dzd2w\" (UID: \"c2661932-d940-478a-a53c-e34fb0fafcf8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w" Apr 23 08:50:46.674714 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:46.674376 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2661932-d940-478a-a53c-e34fb0fafcf8-openshift-state-metrics-tls podName:c2661932-d940-478a-a53c-e34fb0fafcf8 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:47.17436056 +0000 UTC m=+158.691219878 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/c2661932-d940-478a-a53c-e34fb0fafcf8-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-dzd2w" (UID: "c2661932-d940-478a-a53c-e34fb0fafcf8") : secret "openshift-state-metrics-tls" not found Apr 23 08:50:46.674714 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.674425 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/32bc556f-c293-4692-aaf0-8ddd74da2d7e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.674714 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.674465 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/32bc556f-c293-4692-aaf0-8ddd74da2d7e-node-exporter-textfile\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.674714 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.674659 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/be5352cb-7773-4886-82c2-b96726cfae23-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jjhq5\" (UID: \"be5352cb-7773-4886-82c2-b96726cfae23\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:46.675322 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.675009 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c2661932-d940-478a-a53c-e34fb0fafcf8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-dzd2w\" (UID: 
\"c2661932-d940-478a-a53c-e34fb0fafcf8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w" Apr 23 08:50:46.676626 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.676597 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/be5352cb-7773-4886-82c2-b96726cfae23-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jjhq5\" (UID: \"be5352cb-7773-4886-82c2-b96726cfae23\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:46.676919 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.676903 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c2661932-d940-478a-a53c-e34fb0fafcf8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-dzd2w\" (UID: \"c2661932-d940-478a-a53c-e34fb0fafcf8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w" Apr 23 08:50:46.719229 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.719201 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rkkg\" (UniqueName: \"kubernetes.io/projected/be5352cb-7773-4886-82c2-b96726cfae23-kube-api-access-8rkkg\") pod \"kube-state-metrics-69db897b98-jjhq5\" (UID: \"be5352cb-7773-4886-82c2-b96726cfae23\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:46.719740 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.719720 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcpdg\" (UniqueName: \"kubernetes.io/projected/c2661932-d940-478a-a53c-e34fb0fafcf8-kube-api-access-gcpdg\") pod \"openshift-state-metrics-9d44df66c-dzd2w\" (UID: \"c2661932-d940-478a-a53c-e34fb0fafcf8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w" Apr 23 08:50:46.775635 
ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.775608 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/32bc556f-c293-4692-aaf0-8ddd74da2d7e-node-exporter-tls\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.775756 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.775638 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/32bc556f-c293-4692-aaf0-8ddd74da2d7e-node-exporter-accelerators-collector-config\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.775756 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.775662 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/32bc556f-c293-4692-aaf0-8ddd74da2d7e-node-exporter-wtmp\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.775866 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.775783 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/32bc556f-c293-4692-aaf0-8ddd74da2d7e-node-exporter-wtmp\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.775866 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.775842 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/32bc556f-c293-4692-aaf0-8ddd74da2d7e-root\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " 
pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.775965 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.775872 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxs5j\" (UniqueName: \"kubernetes.io/projected/32bc556f-c293-4692-aaf0-8ddd74da2d7e-kube-api-access-fxs5j\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.775965 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.775900 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/32bc556f-c293-4692-aaf0-8ddd74da2d7e-root\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.775965 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.775939 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/32bc556f-c293-4692-aaf0-8ddd74da2d7e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.776108 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.775982 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/32bc556f-c293-4692-aaf0-8ddd74da2d7e-node-exporter-textfile\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.776108 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.776032 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/32bc556f-c293-4692-aaf0-8ddd74da2d7e-metrics-client-ca\") pod \"node-exporter-htrdc\" 
(UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.776108 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.776075 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/32bc556f-c293-4692-aaf0-8ddd74da2d7e-sys\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.776256 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.776157 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/32bc556f-c293-4692-aaf0-8ddd74da2d7e-sys\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.776345 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.776322 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/32bc556f-c293-4692-aaf0-8ddd74da2d7e-node-exporter-accelerators-collector-config\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.776345 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.776333 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/32bc556f-c293-4692-aaf0-8ddd74da2d7e-node-exporter-textfile\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.776622 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.776587 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/32bc556f-c293-4692-aaf0-8ddd74da2d7e-metrics-client-ca\") pod 
\"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.778217 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.778191 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/32bc556f-c293-4692-aaf0-8ddd74da2d7e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.778322 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.778307 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/32bc556f-c293-4692-aaf0-8ddd74da2d7e-node-exporter-tls\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.785418 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.785379 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxs5j\" (UniqueName: \"kubernetes.io/projected/32bc556f-c293-4692-aaf0-8ddd74da2d7e-kube-api-access-fxs5j\") pod \"node-exporter-htrdc\" (UID: \"32bc556f-c293-4692-aaf0-8ddd74da2d7e\") " pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.910849 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:46.910820 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-htrdc" Apr 23 08:50:46.919772 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:50:46.919750 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32bc556f_c293_4692_aaf0_8ddd74da2d7e.slice/crio-c563f09066a2f1780a0bbcdecb872561c60ced2cbd82c0762c1a5d204e02fd40 WatchSource:0}: Error finding container c563f09066a2f1780a0bbcdecb872561c60ced2cbd82c0762c1a5d204e02fd40: Status 404 returned error can't find the container with id c563f09066a2f1780a0bbcdecb872561c60ced2cbd82c0762c1a5d204e02fd40 Apr 23 08:50:47.075322 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:47.075277 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-wnmff" podUID="b87e8c66-90ff-454c-9c82-3fe28797e8df" Apr 23 08:50:47.179236 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.179156 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5352cb-7773-4886-82c2-b96726cfae23-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jjhq5\" (UID: \"be5352cb-7773-4886-82c2-b96726cfae23\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:47.179236 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.179207 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2661932-d940-478a-a53c-e34fb0fafcf8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-dzd2w\" (UID: \"c2661932-d940-478a-a53c-e34fb0fafcf8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w" Apr 23 08:50:47.181654 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.181629 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2661932-d940-478a-a53c-e34fb0fafcf8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-dzd2w\" (UID: \"c2661932-d940-478a-a53c-e34fb0fafcf8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w" Apr 23 08:50:47.181762 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.181717 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5352cb-7773-4886-82c2-b96726cfae23-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jjhq5\" (UID: \"be5352cb-7773-4886-82c2-b96726cfae23\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:47.455636 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.455563 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w" Apr 23 08:50:47.461367 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.461342 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" Apr 23 08:50:47.566968 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.566933 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-htrdc" event={"ID":"32bc556f-c293-4692-aaf0-8ddd74da2d7e","Type":"ContainerStarted","Data":"c563f09066a2f1780a0bbcdecb872561c60ced2cbd82c0762c1a5d204e02fd40"} Apr 23 08:50:47.585542 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.585325 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:50:47.591028 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.591006 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.594247 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.594064 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 23 08:50:47.594247 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.594209 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 23 08:50:47.594461 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.594314 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 23 08:50:47.594461 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.594338 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 23 08:50:47.594714 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.594599 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 23 08:50:47.594714 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.594623 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 23 08:50:47.594714 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.594678 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 23 08:50:47.594910 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.594820 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-889gt\"" Apr 23 08:50:47.594973 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.594892 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 23 08:50:47.594973 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.594960 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 23 08:50:47.604633 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.604437 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:50:47.606049 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.606030 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w"] Apr 23 08:50:47.637552 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.637527 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jjhq5"] Apr 23 08:50:47.683354 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.683320 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/088441f8-e5d3-4665-81e0-ded5627fa821-config-out\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.683491 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.683361 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.683491 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.683412 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" 
(UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.683607 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.683506 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/088441f8-e5d3-4665-81e0-ded5627fa821-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.683607 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.683537 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/088441f8-e5d3-4665-81e0-ded5627fa821-tls-assets\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.683607 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.683567 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.683607 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.683601 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.683760 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.683644 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-web-config\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.683760 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.683696 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/088441f8-e5d3-4665-81e0-ded5627fa821-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.683852 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.683771 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvssp\" (UniqueName: \"kubernetes.io/projected/088441f8-e5d3-4665-81e0-ded5627fa821-kube-api-access-bvssp\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.683852 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.683802 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/088441f8-e5d3-4665-81e0-ded5627fa821-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.683927 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.683848 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.683927 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.683882 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-config-volume\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.727347 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:50:47.727269 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2661932_d940_478a_a53c_e34fb0fafcf8.slice/crio-282beab3c9d914f70be13cb323580e5eb0bbd384571d7a90a377466622bb51be WatchSource:0}: Error finding container 282beab3c9d914f70be13cb323580e5eb0bbd384571d7a90a377466622bb51be: Status 404 returned error can't find the container with id 282beab3c9d914f70be13cb323580e5eb0bbd384571d7a90a377466622bb51be Apr 23 08:50:47.728022 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:50:47.727989 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe5352cb_7773_4886_82c2_b96726cfae23.slice/crio-638f411beb88995db14046d910eb59bbd3012c360eb0b5ec91e773410e0d9e6c WatchSource:0}: Error finding container 638f411beb88995db14046d910eb59bbd3012c360eb0b5ec91e773410e0d9e6c: Status 404 returned error can't find the container with id 638f411beb88995db14046d910eb59bbd3012c360eb0b5ec91e773410e0d9e6c Apr 23 08:50:47.785084 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.785061 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/088441f8-e5d3-4665-81e0-ded5627fa821-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.785200 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.785099 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/088441f8-e5d3-4665-81e0-ded5627fa821-tls-assets\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.785200 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.785126 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.785200 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.785157 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.785200 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.785181 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-web-config\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.785415 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.785213 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/088441f8-e5d3-4665-81e0-ded5627fa821-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.785415 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:47.785229 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/088441f8-e5d3-4665-81e0-ded5627fa821-alertmanager-trusted-ca-bundle podName:088441f8-e5d3-4665-81e0-ded5627fa821 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:48.285207427 +0000 UTC m=+159.802066744 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/088441f8-e5d3-4665-81e0-ded5627fa821-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "088441f8-e5d3-4665-81e0-ded5627fa821") : configmap references non-existent config key: ca-bundle.crt Apr 23 08:50:47.785415 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.785306 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvssp\" (UniqueName: \"kubernetes.io/projected/088441f8-e5d3-4665-81e0-ded5627fa821-kube-api-access-bvssp\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.785415 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.785341 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/088441f8-e5d3-4665-81e0-ded5627fa821-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.785415 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.785409 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.785663 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.785448 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-config-volume\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.785663 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.785494 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/088441f8-e5d3-4665-81e0-ded5627fa821-config-out\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.785663 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.785521 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.785663 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.785553 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.786015 ip-10-0-139-48 
kubenswrapper[2574]: I0423 08:50:47.785900 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/088441f8-e5d3-4665-81e0-ded5627fa821-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.786015 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:47.785994 2574 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 23 08:50:47.786120 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:47.786042 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-secret-alertmanager-main-tls podName:088441f8-e5d3-4665-81e0-ded5627fa821 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:48.28602604 +0000 UTC m=+159.802885364 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "088441f8-e5d3-4665-81e0-ded5627fa821") : secret "alertmanager-main-tls" not found Apr 23 08:50:47.786798 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.786627 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/088441f8-e5d3-4665-81e0-ded5627fa821-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.789871 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.789849 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/088441f8-e5d3-4665-81e0-ded5627fa821-config-out\") pod \"alertmanager-main-0\" (UID: 
\"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.790342 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.790309 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-web-config\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.790759 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.790738 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.790928 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.790889 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.791269 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.791246 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-config-volume\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.791458 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.791377 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/088441f8-e5d3-4665-81e0-ded5627fa821-tls-assets\") pod 
\"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.791458 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.791425 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.793480 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.793453 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:47.794892 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:47.794850 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvssp\" (UniqueName: \"kubernetes.io/projected/088441f8-e5d3-4665-81e0-ded5627fa821-kube-api-access-bvssp\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:48.291204 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.291165 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/088441f8-e5d3-4665-81e0-ded5627fa821-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:48.291204 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.291208 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" 
(UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:48.291919 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.291898 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/088441f8-e5d3-4665-81e0-ded5627fa821-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:48.293618 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.293601 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/088441f8-e5d3-4665-81e0-ded5627fa821-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"088441f8-e5d3-4665-81e0-ded5627fa821\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:48.507690 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.507647 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:50:48.571130 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.570945 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-57d897d645-7xzz9"] Apr 23 08:50:48.583463 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.582164 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.583637 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.583589 2574 generic.go:358] "Generic (PLEG): container finished" podID="32bc556f-c293-4692-aaf0-8ddd74da2d7e" containerID="97a3cfaaf0d18b61679b870f218bccea4599d0945cc86afe08eb90fb5ef87fee" exitCode=0 Apr 23 08:50:48.583835 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.583686 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-htrdc" event={"ID":"32bc556f-c293-4692-aaf0-8ddd74da2d7e","Type":"ContainerDied","Data":"97a3cfaaf0d18b61679b870f218bccea4599d0945cc86afe08eb90fb5ef87fee"} Apr 23 08:50:48.586275 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.585559 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 23 08:50:48.586275 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.585849 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 23 08:50:48.586275 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.585879 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-57d897d645-7xzz9"] Apr 23 08:50:48.586275 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.585945 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 23 08:50:48.586275 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.586025 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 23 08:50:48.586275 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.586123 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 23 
08:50:48.586275 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.586197 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-7fnzw\"" Apr 23 08:50:48.587871 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.587769 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-dg9l2b6ctfdck\"" Apr 23 08:50:48.590338 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.589093 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" event={"ID":"be5352cb-7773-4886-82c2-b96726cfae23","Type":"ContainerStarted","Data":"638f411beb88995db14046d910eb59bbd3012c360eb0b5ec91e773410e0d9e6c"} Apr 23 08:50:48.591198 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.591169 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w" event={"ID":"c2661932-d940-478a-a53c-e34fb0fafcf8","Type":"ContainerStarted","Data":"ebcc1a844db039c636282182118b77f323241fd6949fdf55877178fb07e8cfde"} Apr 23 08:50:48.591285 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.591207 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w" event={"ID":"c2661932-d940-478a-a53c-e34fb0fafcf8","Type":"ContainerStarted","Data":"583de6d3fcb5839355e299b6333ca23590cfb58a01467d9015bbbd6705f41b26"} Apr 23 08:50:48.591285 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.591220 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w" event={"ID":"c2661932-d940-478a-a53c-e34fb0fafcf8","Type":"ContainerStarted","Data":"282beab3c9d914f70be13cb323580e5eb0bbd384571d7a90a377466622bb51be"} Apr 23 08:50:48.596512 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.596486 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-secret-grpc-tls\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.596607 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.596572 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.596678 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.596612 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.596678 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.596649 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-secret-thanos-querier-tls\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.596784 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.596675 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.596784 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.596729 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.596784 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.596758 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-metrics-client-ca\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.596944 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.596811 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl26s\" (UniqueName: \"kubernetes.io/projected/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-kube-api-access-rl26s\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.667107 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.667062 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:50:48.673443 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:50:48.673416 2574 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod088441f8_e5d3_4665_81e0_ded5627fa821.slice/crio-4ebd6087f2fdb45b90f26dc375d47b151ade26e8ca9af76eea288c47f6c0b1d2 WatchSource:0}: Error finding container 4ebd6087f2fdb45b90f26dc375d47b151ade26e8ca9af76eea288c47f6c0b1d2: Status 404 returned error can't find the container with id 4ebd6087f2fdb45b90f26dc375d47b151ade26e8ca9af76eea288c47f6c0b1d2 Apr 23 08:50:48.698517 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.697491 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.698517 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.697541 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.698517 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.697580 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-secret-thanos-querier-tls\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.698517 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.697606 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.698517 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.697669 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.698517 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.697695 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-metrics-client-ca\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.698517 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.697761 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rl26s\" (UniqueName: \"kubernetes.io/projected/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-kube-api-access-rl26s\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.698517 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.697814 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-secret-grpc-tls\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: 
\"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.700143 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.700117 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-metrics-client-ca\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.701903 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.701879 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-secret-grpc-tls\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.702504 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.702482 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.702594 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.702570 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.702986 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.702962 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.703065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.703048 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-secret-thanos-querier-tls\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.704522 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.704503 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.708972 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.708933 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl26s\" (UniqueName: \"kubernetes.io/projected/8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3-kube-api-access-rl26s\") pod \"thanos-querier-57d897d645-7xzz9\" (UID: \"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3\") " pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:50:48.897964 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:48.897917 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9"
Apr 23 08:50:49.427404 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:49.427338 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-57d897d645-7xzz9"]
Apr 23 08:50:49.431512 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:50:49.430630 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d8f8720_5c0e_48fa_93b5_fe7e1fa8f4c3.slice/crio-cc217e15b4b740371a684b17184a045e2433a21866fabb468eb1ccfb1b6b6d7c WatchSource:0}: Error finding container cc217e15b4b740371a684b17184a045e2433a21866fabb468eb1ccfb1b6b6d7c: Status 404 returned error can't find the container with id cc217e15b4b740371a684b17184a045e2433a21866fabb468eb1ccfb1b6b6d7c
Apr 23 08:50:49.595580 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:49.595526 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" event={"ID":"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3","Type":"ContainerStarted","Data":"cc217e15b4b740371a684b17184a045e2433a21866fabb468eb1ccfb1b6b6d7c"}
Apr 23 08:50:49.597785 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:49.597752 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-htrdc" event={"ID":"32bc556f-c293-4692-aaf0-8ddd74da2d7e","Type":"ContainerStarted","Data":"09ad896eb9626e7e1da9d89012844f7a94ba0ac422da888f7b22663812c9f179"}
Apr 23 08:50:49.597785 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:49.597787 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-htrdc" event={"ID":"32bc556f-c293-4692-aaf0-8ddd74da2d7e","Type":"ContainerStarted","Data":"1ee3a3bb3b6967cd51f54b3c87c745db4b721e65e5f1891ee1547c5a6f3043c9"}
Apr 23 08:50:49.599380 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:49.599358 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"088441f8-e5d3-4665-81e0-ded5627fa821","Type":"ContainerStarted","Data":"4ebd6087f2fdb45b90f26dc375d47b151ade26e8ca9af76eea288c47f6c0b1d2"}
Apr 23 08:50:49.601687 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:49.601662 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" event={"ID":"be5352cb-7773-4886-82c2-b96726cfae23","Type":"ContainerStarted","Data":"4c6bd517fbb397c3f885344ce21076adaa0f14cc40943dfee2c2235199e2c654"}
Apr 23 08:50:49.601801 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:49.601694 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" event={"ID":"be5352cb-7773-4886-82c2-b96726cfae23","Type":"ContainerStarted","Data":"1ef64fc4f41841c5928d08a6b598623d58050a35a202c51232dfac39d54a5c3b"}
Apr 23 08:50:49.601801 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:49.601709 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" event={"ID":"be5352cb-7773-4886-82c2-b96726cfae23","Type":"ContainerStarted","Data":"8448874e60164c4dbc46b43928c27337f5eae285e9a5cf5f46cd83baaff04cc9"}
Apr 23 08:50:49.604417 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:49.604369 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w" event={"ID":"c2661932-d940-478a-a53c-e34fb0fafcf8","Type":"ContainerStarted","Data":"876197d64cd8bd2c3a0a17740febc6bf6b388d1e744f65a9e9b70d0ca11a3831"}
Apr 23 08:50:49.644902 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:49.644841 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-jjhq5" podStartSLOduration=2.092960257 podStartE2EDuration="3.644824007s" podCreationTimestamp="2026-04-23 08:50:46 +0000 UTC" firstStartedPulling="2026-04-23 08:50:47.730131735 +0000 UTC m=+159.246991050" lastFinishedPulling="2026-04-23 08:50:49.281995483 +0000 UTC m=+160.798854800" observedRunningTime="2026-04-23 08:50:49.644589231 +0000 UTC m=+161.161448568" watchObservedRunningTime="2026-04-23 08:50:49.644824007 +0000 UTC m=+161.161683342"
Apr 23 08:50:49.645443 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:49.645399 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-htrdc" podStartSLOduration=2.792096364 podStartE2EDuration="3.645372606s" podCreationTimestamp="2026-04-23 08:50:46 +0000 UTC" firstStartedPulling="2026-04-23 08:50:46.921305546 +0000 UTC m=+158.438164860" lastFinishedPulling="2026-04-23 08:50:47.774581789 +0000 UTC m=+159.291441102" observedRunningTime="2026-04-23 08:50:49.620055681 +0000 UTC m=+161.136915018" watchObservedRunningTime="2026-04-23 08:50:49.645372606 +0000 UTC m=+161.162231943"
Apr 23 08:50:50.609350 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:50.609316 2574 generic.go:358] "Generic (PLEG): container finished" podID="088441f8-e5d3-4665-81e0-ded5627fa821" containerID="c82ef86506fd99159e7d8f75a17c708c656a310144b3b10d3743440efe9f309f" exitCode=0
Apr 23 08:50:50.609857 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:50.609363 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"088441f8-e5d3-4665-81e0-ded5627fa821","Type":"ContainerDied","Data":"c82ef86506fd99159e7d8f75a17c708c656a310144b3b10d3743440efe9f309f"}
Apr 23 08:50:50.654375 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:50.654329 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzd2w" podStartSLOduration=3.221735981 podStartE2EDuration="4.654313079s" podCreationTimestamp="2026-04-23 08:50:46 +0000 UTC" firstStartedPulling="2026-04-23 08:50:47.846722632 +0000 UTC m=+159.363581944" lastFinishedPulling="2026-04-23 08:50:49.279299715 +0000 UTC m=+160.796159042" observedRunningTime="2026-04-23 08:50:49.666554129 +0000 UTC m=+161.183413464" watchObservedRunningTime="2026-04-23 08:50:50.654313079 +0000 UTC m=+162.171172413"
Apr 23 08:50:50.818352 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:50.818321 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert\") pod \"ingress-canary-pnnnr\" (UID: \"85967c1f-fcc5-477d-949f-62d16a82fb18\") " pod="openshift-ingress-canary/ingress-canary-pnnnr"
Apr 23 08:50:50.818547 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:50.818365 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls\") pod \"dns-default-fxlql\" (UID: \"67ac0c42-3257-496f-9fbd-98d7b980d7d5\") " pod="openshift-dns/dns-default-fxlql"
Apr 23 08:50:50.820849 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:50.820826 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67ac0c42-3257-496f-9fbd-98d7b980d7d5-metrics-tls\") pod \"dns-default-fxlql\" (UID: \"67ac0c42-3257-496f-9fbd-98d7b980d7d5\") " pod="openshift-dns/dns-default-fxlql"
Apr 23 08:50:50.821078 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:50.821059 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85967c1f-fcc5-477d-949f-62d16a82fb18-cert\") pod \"ingress-canary-pnnnr\" (UID: \"85967c1f-fcc5-477d-949f-62d16a82fb18\") " pod="openshift-ingress-canary/ingress-canary-pnnnr"
Apr 23 08:50:51.015280 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.015192 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-666ff68b9f-kzfmt"]
Apr 23 08:50:51.019009 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.018983 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.021977 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.021891 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-m7p75\""
Apr 23 08:50:51.021977 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.021920 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 23 08:50:51.021977 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.021933 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 23 08:50:51.023215 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.023187 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 23 08:50:51.023336 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.023243 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-du49ukvcmrjbo\""
Apr 23 08:50:51.023336 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.023322 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 23 08:50:51.026951 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.026930 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-666ff68b9f-kzfmt"]
Apr 23 08:50:51.066035 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.066003 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hskmq\""
Apr 23 08:50:51.073169 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.073144 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fxlql"
Apr 23 08:50:51.120511 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.120476 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44466\" (UniqueName: \"kubernetes.io/projected/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-kube-api-access-44466\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.120690 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.120552 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-client-ca-bundle\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.120690 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.120657 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-secret-metrics-server-tls\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.120803 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.120711 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-audit-log\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.120803 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.120786 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-metrics-server-audit-profiles\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.120914 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.120825 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-secret-metrics-server-client-certs\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.120914 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.120851 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.221523 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.221486 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-client-ca-bundle\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.221707 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.221549 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-secret-metrics-server-tls\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.221707 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.221587 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-audit-log\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.221707 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.221641 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-metrics-server-audit-profiles\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.221873 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.221700 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-secret-metrics-server-client-certs\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.221873 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.221739 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.221873 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.221776 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44466\" (UniqueName: \"kubernetes.io/projected/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-kube-api-access-44466\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.222728 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.222092 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-audit-log\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.222728 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.222682 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-metrics-server-audit-profiles\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.222965 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.222730 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.224506 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.224483 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-client-ca-bundle\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.224711 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.224688 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-secret-metrics-server-client-certs\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.224913 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.224889 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-secret-metrics-server-tls\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.235486 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.235456 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44466\" (UniqueName: \"kubernetes.io/projected/2ed1b7b6-6eff-4336-96e8-84d4433bb4bc-kube-api-access-44466\") pod \"metrics-server-666ff68b9f-kzfmt\" (UID: \"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc\") " pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.331113 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.331082 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt"
Apr 23 08:50:51.616053 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.616024 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-666ff68b9f-kzfmt"]
Apr 23 08:50:51.618129 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:50:51.618104 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ed1b7b6_6eff_4336_96e8_84d4433bb4bc.slice/crio-770846b5e30696ff6e082c355077982d503889cc07193174877fddb7562cea4b WatchSource:0}: Error finding container 770846b5e30696ff6e082c355077982d503889cc07193174877fddb7562cea4b: Status 404 returned error can't find the container with id 770846b5e30696ff6e082c355077982d503889cc07193174877fddb7562cea4b
Apr 23 08:50:51.634792 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:51.634770 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fxlql"]
Apr 23 08:50:51.637759 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:50:51.637732 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67ac0c42_3257_496f_9fbd_98d7b980d7d5.slice/crio-84b53b5b50e5fe5bf2bcea2a02606af7dc351aeb9db6c68419e2aabbc57cdfd6 WatchSource:0}: Error finding container 84b53b5b50e5fe5bf2bcea2a02606af7dc351aeb9db6c68419e2aabbc57cdfd6: Status 404 returned error can't find the container with id 84b53b5b50e5fe5bf2bcea2a02606af7dc351aeb9db6c68419e2aabbc57cdfd6
Apr 23 08:50:52.530351 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:52.530303 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56bd7886c-mnnsv"
Apr 23 08:50:52.530565 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:52.530421 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56bd7886c-mnnsv"
Apr 23 08:50:52.532459 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:52.532041 2574 patch_prober.go:28] interesting pod/console-56bd7886c-mnnsv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.18:8443/health\": dial tcp 10.134.0.18:8443: connect: connection refused" start-of-body=
Apr 23 08:50:52.532459 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:52.532091 2574 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-56bd7886c-mnnsv" podUID="d2cb1832-d60b-4124-bae2-26882e380037" containerName="console" probeResult="failure" output="Get \"https://10.134.0.18:8443/health\": dial tcp 10.134.0.18:8443: connect: connection refused"
Apr 23 08:50:52.621376 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:52.621276 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" event={"ID":"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3","Type":"ContainerStarted","Data":"1d0ae91c24988d406980aa50763a2904a4e0f83c8ce432f1f46e87eec2eb1088"}
Apr 23 08:50:52.621376 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:52.621323 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" event={"ID":"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3","Type":"ContainerStarted","Data":"071dd889de2470a235d8eee72ed633fc234fd437f170461fb32da042e54863e4"}
Apr 23 08:50:52.621376 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:52.621339 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" event={"ID":"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3","Type":"ContainerStarted","Data":"6190cf0a5317c6341d6edaebca1fd8109610e00fa759967f99553ccafd45196d"}
Apr 23 08:50:52.622790 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:52.622713 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fxlql" event={"ID":"67ac0c42-3257-496f-9fbd-98d7b980d7d5","Type":"ContainerStarted","Data":"84b53b5b50e5fe5bf2bcea2a02606af7dc351aeb9db6c68419e2aabbc57cdfd6"}
Apr 23 08:50:52.624907 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:52.624874 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt" event={"ID":"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc","Type":"ContainerStarted","Data":"770846b5e30696ff6e082c355077982d503889cc07193174877fddb7562cea4b"}
Apr 23 08:50:53.245975 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.245941 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9949b69cb-2t8pg"]
Apr 23 08:50:53.284313 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.284289 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-699bc9f49f-v54rl"]
Apr 23 08:50:53.288029 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.288012 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.300011 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.299985 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-699bc9f49f-v54rl"]
Apr 23 08:50:53.341615 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.341578 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/24bf8987-c434-4d6c-896d-3e931cd9807a-console-oauth-config\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.341745 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.341712 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tfkv\" (UniqueName: \"kubernetes.io/projected/24bf8987-c434-4d6c-896d-3e931cd9807a-kube-api-access-6tfkv\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.341803 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.341745 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/24bf8987-c434-4d6c-896d-3e931cd9807a-console-serving-cert\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.341803 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.341768 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-service-ca\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.341902 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.341805 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-trusted-ca-bundle\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.341902 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.341859 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-console-config\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.341988 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.341918 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-oauth-serving-cert\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.442791 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.442747 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-console-config\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.442971 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.442829 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-oauth-serving-cert\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.442971 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.442856 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/24bf8987-c434-4d6c-896d-3e931cd9807a-console-oauth-config\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.442971 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.442939 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tfkv\" (UniqueName: \"kubernetes.io/projected/24bf8987-c434-4d6c-896d-3e931cd9807a-kube-api-access-6tfkv\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.443125 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.442977 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/24bf8987-c434-4d6c-896d-3e931cd9807a-console-serving-cert\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.443125 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.443002 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-service-ca\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.443125 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.443027 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-trusted-ca-bundle\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.443704 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.443674 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-console-config\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.443823 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.443736 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-oauth-serving-cert\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.444578 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.444529 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-service-ca\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.444692 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.444631 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-trusted-ca-bundle\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.445841 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.445817 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/24bf8987-c434-4d6c-896d-3e931cd9807a-console-oauth-config\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.445971 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.445954 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/24bf8987-c434-4d6c-896d-3e931cd9807a-console-serving-cert\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.462055 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.462011 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tfkv\" (UniqueName: \"kubernetes.io/projected/24bf8987-c434-4d6c-896d-3e931cd9807a-kube-api-access-6tfkv\") pod \"console-699bc9f49f-v54rl\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.598424 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.598326 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-699bc9f49f-v54rl"
Apr 23 08:50:53.823708 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.823656 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-699bc9f49f-v54rl"]
Apr 23 08:50:53.922138 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.922033 2574 patch_prober.go:28] interesting pod/image-registry-5bff4876d9-6ljch container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 08:50:53.922138 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:53.922088 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5bff4876d9-6ljch" podUID="92f47ad8-406e-4c78-b8b6-7565d305e19d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 08:50:54.633983 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.633950 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"088441f8-e5d3-4665-81e0-ded5627fa821","Type":"ContainerStarted","Data":"d470236c920f1315692258d926b5bc827d3df8271da4a32cf01571a96543a165"}
Apr 23 08:50:54.634159 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.633988 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"088441f8-e5d3-4665-81e0-ded5627fa821","Type":"ContainerStarted","Data":"d419c6d1e8ae228ffa39779904c57f9ed018e8b4cef95ca67582abc7d7fc68c8"}
Apr 23 08:50:54.634159 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.634001 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"088441f8-e5d3-4665-81e0-ded5627fa821","Type":"ContainerStarted","Data":"9fcad7e7809bf42fa7e1406b93286e97c7a456e3fe9c460f1c3045ff996026a6"}
Apr 23 08:50:54.634159 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.634013 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"088441f8-e5d3-4665-81e0-ded5627fa821","Type":"ContainerStarted","Data":"ef808460e81c30390a5ca60c2c67fd5a7b51c91012f8d630e20c583a483458fc"}
Apr 23 08:50:54.634159 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.634024 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"088441f8-e5d3-4665-81e0-ded5627fa821","Type":"ContainerStarted","Data":"8359496f394066fa804c241d28057b176aecfc12456687d5ff5a20c12d489560"}
Apr 23 08:50:54.634159 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.634035 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"088441f8-e5d3-4665-81e0-ded5627fa821","Type":"ContainerStarted","Data":"a5438151404183b4de8bc7333c0bfec3826d456620bc4fa1007a4c77e22fb6cb"}
Apr 23 08:50:54.635491 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.635460 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699bc9f49f-v54rl" event={"ID":"24bf8987-c434-4d6c-896d-3e931cd9807a","Type":"ContainerStarted","Data":"cf299446f867205acabb6409429fe10ea563522f6cd9e457d83471a7fd580fff"}
Apr 23 08:50:54.635622 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.635497 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699bc9f49f-v54rl" event={"ID":"24bf8987-c434-4d6c-896d-3e931cd9807a","Type":"ContainerStarted","Data":"90b0ec22d1b40fbcdc66b64eac8bb2998ed569706ff2191bbb86e4cc51c7a49c"}
Apr 23 08:50:54.637844 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.637808 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" event={"ID":"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3","Type":"ContainerStarted","Data":"bf252e6de3f6fdb2e134ecd6c0177473caad6f6880def9070c6f7db522493af9"}
Apr 23 08:50:54.637957 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.637851 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" event={"ID":"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3","Type":"ContainerStarted","Data":"abdc13a7cd7f1d5aa33e2c92a1b7f6f0ebc84470bfb3cfc12a9c0f8105262333"}
Apr 23 08:50:54.637957 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.637864 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" event={"ID":"8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3","Type":"ContainerStarted","Data":"5d927a621b02b65e5f744f47e9fbee9a71f6c894992e31f87ba181979fe449a4"}
Apr 23 08:50:54.638062 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.638007 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9"
Apr 23 08:50:54.642335 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.642282 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fxlql" event={"ID":"67ac0c42-3257-496f-9fbd-98d7b980d7d5","Type":"ContainerStarted","Data":"d419388dcb154b8bd47584b763fd025a3442ec8ed7deef37ccf5b0c74bdd1a3d"}
Apr 23 08:50:54.642335 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.642313 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fxlql" event={"ID":"67ac0c42-3257-496f-9fbd-98d7b980d7d5","Type":"ContainerStarted","Data":"c1411fc0f3c7c46c54cedfdd7c872a8b17087bbed089d8249538d87c82cd214b"}
Apr 23 08:50:54.642509 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.642444 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-fxlql"
Apr 23 08:50:54.643673 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.643645 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt" event={"ID":"2ed1b7b6-6eff-4336-96e8-84d4433bb4bc","Type":"ContainerStarted","Data":"eecf0426dcf4ed445a07f0dfdaa10338c6d5776aef0a0f0f29737e83369e2986"}
Apr 23 08:50:54.667500 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.667462 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.676204543 podStartE2EDuration="7.667451652s" podCreationTimestamp="2026-04-23 08:50:47 +0000 UTC" firstStartedPulling="2026-04-23 08:50:48.676045559 +0000 UTC m=+160.192904872" lastFinishedPulling="2026-04-23 08:50:53.667292655 +0000 UTC m=+165.184151981" observedRunningTime="2026-04-23 08:50:54.665547609 +0000 UTC m=+166.182406945" watchObservedRunningTime="2026-04-23 08:50:54.667451652 +0000 UTC m=+166.184310986"
Apr 23 08:50:54.685361 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.685323 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt" podStartSLOduration=2.5812136949999998 podStartE2EDuration="4.685311839s" podCreationTimestamp="2026-04-23 08:50:50 +0000 UTC" firstStartedPulling="2026-04-23 08:50:51.620042076 +0000 UTC m=+163.136901389" lastFinishedPulling="2026-04-23 08:50:53.724140206 +0000 UTC m=+165.240999533" observedRunningTime="2026-04-23 08:50:54.683301111 +0000 UTC m=+166.200160447" watchObservedRunningTime="2026-04-23 08:50:54.685311839 +0000 UTC m=+166.202171174"
Apr 23 08:50:54.702894 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.702856 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-699bc9f49f-v54rl" podStartSLOduration=1.702842903
podStartE2EDuration="1.702842903s" podCreationTimestamp="2026-04-23 08:50:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:50:54.701378669 +0000 UTC m=+166.218238005" watchObservedRunningTime="2026-04-23 08:50:54.702842903 +0000 UTC m=+166.219702238" Apr 23 08:50:54.728024 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.727985 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" podStartSLOduration=2.4917017870000002 podStartE2EDuration="6.727975689s" podCreationTimestamp="2026-04-23 08:50:48 +0000 UTC" firstStartedPulling="2026-04-23 08:50:49.432618831 +0000 UTC m=+160.949478148" lastFinishedPulling="2026-04-23 08:50:53.668892738 +0000 UTC m=+165.185752050" observedRunningTime="2026-04-23 08:50:54.727411382 +0000 UTC m=+166.244270723" watchObservedRunningTime="2026-04-23 08:50:54.727975689 +0000 UTC m=+166.244835024" Apr 23 08:50:54.747340 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:54.747288 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fxlql" podStartSLOduration=130.718346269 podStartE2EDuration="2m12.747278416s" podCreationTimestamp="2026-04-23 08:48:42 +0000 UTC" firstStartedPulling="2026-04-23 08:50:51.64186259 +0000 UTC m=+163.158721904" lastFinishedPulling="2026-04-23 08:50:53.670794733 +0000 UTC m=+165.187654051" observedRunningTime="2026-04-23 08:50:54.746460855 +0000 UTC m=+166.263320189" watchObservedRunningTime="2026-04-23 08:50:54.747278416 +0000 UTC m=+166.264137751" Apr 23 08:50:58.066010 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:58.065957 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wnmff" Apr 23 08:50:58.932178 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:58.932133 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5bff4876d9-6ljch" podUID="92f47ad8-406e-4c78-b8b6-7565d305e19d" containerName="registry" containerID="cri-o://c0ee9bf6009c535d22ff3c9c41c21dfc234b4aa8dc108309a9b0023ca6865423" gracePeriod=30 Apr 23 08:50:59.165443 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.165420 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5bff4876d9-6ljch" Apr 23 08:50:59.294618 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.294538 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-bound-sa-token\") pod \"92f47ad8-406e-4c78-b8b6-7565d305e19d\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " Apr 23 08:50:59.294618 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.294580 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-certificates\") pod \"92f47ad8-406e-4c78-b8b6-7565d305e19d\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " Apr 23 08:50:59.294618 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.294617 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/92f47ad8-406e-4c78-b8b6-7565d305e19d-ca-trust-extracted\") pod \"92f47ad8-406e-4c78-b8b6-7565d305e19d\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " Apr 23 08:50:59.294872 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.294719 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/92f47ad8-406e-4c78-b8b6-7565d305e19d-image-registry-private-configuration\") pod \"92f47ad8-406e-4c78-b8b6-7565d305e19d\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " Apr 23 08:50:59.294872 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.294768 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92f47ad8-406e-4c78-b8b6-7565d305e19d-trusted-ca\") pod \"92f47ad8-406e-4c78-b8b6-7565d305e19d\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " Apr 23 08:50:59.294872 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.294804 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls\") pod \"92f47ad8-406e-4c78-b8b6-7565d305e19d\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " Apr 23 08:50:59.294872 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.294827 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/92f47ad8-406e-4c78-b8b6-7565d305e19d-installation-pull-secrets\") pod \"92f47ad8-406e-4c78-b8b6-7565d305e19d\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " Apr 23 08:50:59.294872 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.294850 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v87d\" (UniqueName: \"kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-kube-api-access-2v87d\") pod \"92f47ad8-406e-4c78-b8b6-7565d305e19d\" (UID: \"92f47ad8-406e-4c78-b8b6-7565d305e19d\") " Apr 23 08:50:59.295291 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.295260 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92f47ad8-406e-4c78-b8b6-7565d305e19d-trusted-ca" 
(OuterVolumeSpecName: "trusted-ca") pod "92f47ad8-406e-4c78-b8b6-7565d305e19d" (UID: "92f47ad8-406e-4c78-b8b6-7565d305e19d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:50:59.295709 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.295660 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "92f47ad8-406e-4c78-b8b6-7565d305e19d" (UID: "92f47ad8-406e-4c78-b8b6-7565d305e19d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:50:59.297261 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.297231 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-kube-api-access-2v87d" (OuterVolumeSpecName: "kube-api-access-2v87d") pod "92f47ad8-406e-4c78-b8b6-7565d305e19d" (UID: "92f47ad8-406e-4c78-b8b6-7565d305e19d"). InnerVolumeSpecName "kube-api-access-2v87d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:50:59.297349 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.297288 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f47ad8-406e-4c78-b8b6-7565d305e19d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "92f47ad8-406e-4c78-b8b6-7565d305e19d" (UID: "92f47ad8-406e-4c78-b8b6-7565d305e19d"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:50:59.297467 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.297378 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "92f47ad8-406e-4c78-b8b6-7565d305e19d" (UID: "92f47ad8-406e-4c78-b8b6-7565d305e19d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:50:59.297467 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.297454 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f47ad8-406e-4c78-b8b6-7565d305e19d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "92f47ad8-406e-4c78-b8b6-7565d305e19d" (UID: "92f47ad8-406e-4c78-b8b6-7565d305e19d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:50:59.297573 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.297482 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "92f47ad8-406e-4c78-b8b6-7565d305e19d" (UID: "92f47ad8-406e-4c78-b8b6-7565d305e19d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:50:59.304135 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.304112 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92f47ad8-406e-4c78-b8b6-7565d305e19d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "92f47ad8-406e-4c78-b8b6-7565d305e19d" (UID: "92f47ad8-406e-4c78-b8b6-7565d305e19d"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 08:50:59.395621 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.395595 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-bound-sa-token\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:50:59.395621 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.395618 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-certificates\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:50:59.395766 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.395629 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/92f47ad8-406e-4c78-b8b6-7565d305e19d-ca-trust-extracted\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:50:59.395766 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.395639 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/92f47ad8-406e-4c78-b8b6-7565d305e19d-image-registry-private-configuration\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:50:59.395766 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.395648 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92f47ad8-406e-4c78-b8b6-7565d305e19d-trusted-ca\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:50:59.395766 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.395656 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-registry-tls\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:50:59.395766 
ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.395664 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/92f47ad8-406e-4c78-b8b6-7565d305e19d-installation-pull-secrets\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:50:59.395766 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.395672 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2v87d\" (UniqueName: \"kubernetes.io/projected/92f47ad8-406e-4c78-b8b6-7565d305e19d-kube-api-access-2v87d\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:50:59.659485 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.659456 2574 generic.go:358] "Generic (PLEG): container finished" podID="92f47ad8-406e-4c78-b8b6-7565d305e19d" containerID="c0ee9bf6009c535d22ff3c9c41c21dfc234b4aa8dc108309a9b0023ca6865423" exitCode=0 Apr 23 08:50:59.659643 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.659516 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5bff4876d9-6ljch" Apr 23 08:50:59.659643 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.659553 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5bff4876d9-6ljch" event={"ID":"92f47ad8-406e-4c78-b8b6-7565d305e19d","Type":"ContainerDied","Data":"c0ee9bf6009c535d22ff3c9c41c21dfc234b4aa8dc108309a9b0023ca6865423"} Apr 23 08:50:59.659643 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.659591 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5bff4876d9-6ljch" event={"ID":"92f47ad8-406e-4c78-b8b6-7565d305e19d","Type":"ContainerDied","Data":"082b348038db114e5d61c9ae1e9da2c4eb2444bfd159eab49552411521f6411a"} Apr 23 08:50:59.659643 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.659607 2574 scope.go:117] "RemoveContainer" containerID="c0ee9bf6009c535d22ff3c9c41c21dfc234b4aa8dc108309a9b0023ca6865423" Apr 23 08:50:59.668501 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.668484 2574 scope.go:117] "RemoveContainer" containerID="c0ee9bf6009c535d22ff3c9c41c21dfc234b4aa8dc108309a9b0023ca6865423" Apr 23 08:50:59.668766 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:50:59.668748 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0ee9bf6009c535d22ff3c9c41c21dfc234b4aa8dc108309a9b0023ca6865423\": container with ID starting with c0ee9bf6009c535d22ff3c9c41c21dfc234b4aa8dc108309a9b0023ca6865423 not found: ID does not exist" containerID="c0ee9bf6009c535d22ff3c9c41c21dfc234b4aa8dc108309a9b0023ca6865423" Apr 23 08:50:59.668813 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.668778 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0ee9bf6009c535d22ff3c9c41c21dfc234b4aa8dc108309a9b0023ca6865423"} err="failed to get container status 
\"c0ee9bf6009c535d22ff3c9c41c21dfc234b4aa8dc108309a9b0023ca6865423\": rpc error: code = NotFound desc = could not find container \"c0ee9bf6009c535d22ff3c9c41c21dfc234b4aa8dc108309a9b0023ca6865423\": container with ID starting with c0ee9bf6009c535d22ff3c9c41c21dfc234b4aa8dc108309a9b0023ca6865423 not found: ID does not exist" Apr 23 08:50:59.681090 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.681065 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5bff4876d9-6ljch"] Apr 23 08:50:59.684160 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:50:59.684136 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5bff4876d9-6ljch"] Apr 23 08:51:00.065898 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:00.065825 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pnnnr" Apr 23 08:51:00.068975 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:00.068934 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-55hsm\"" Apr 23 08:51:00.076546 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:00.076523 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pnnnr" Apr 23 08:51:00.200221 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:00.200199 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pnnnr"] Apr 23 08:51:00.202650 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:51:00.202622 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85967c1f_fcc5_477d_949f_62d16a82fb18.slice/crio-26741e20a2ba0d341a5b03af6dc5f7192ea2e7f25cd25d897c031e949a6a974b WatchSource:0}: Error finding container 26741e20a2ba0d341a5b03af6dc5f7192ea2e7f25cd25d897c031e949a6a974b: Status 404 returned error can't find the container with id 26741e20a2ba0d341a5b03af6dc5f7192ea2e7f25cd25d897c031e949a6a974b Apr 23 08:51:00.655824 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:00.655796 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-57d897d645-7xzz9" Apr 23 08:51:00.668815 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:00.668776 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pnnnr" event={"ID":"85967c1f-fcc5-477d-949f-62d16a82fb18","Type":"ContainerStarted","Data":"26741e20a2ba0d341a5b03af6dc5f7192ea2e7f25cd25d897c031e949a6a974b"} Apr 23 08:51:01.071065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:01.070981 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92f47ad8-406e-4c78-b8b6-7565d305e19d" path="/var/lib/kubelet/pods/92f47ad8-406e-4c78-b8b6-7565d305e19d/volumes" Apr 23 08:51:02.530777 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:02.530741 2574 patch_prober.go:28] interesting pod/console-56bd7886c-mnnsv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.18:8443/health\": dial tcp 10.134.0.18:8443: connect: connection refused" start-of-body= Apr 23 
08:51:02.531142 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:02.530788 2574 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-56bd7886c-mnnsv" podUID="d2cb1832-d60b-4124-bae2-26882e380037" containerName="console" probeResult="failure" output="Get \"https://10.134.0.18:8443/health\": dial tcp 10.134.0.18:8443: connect: connection refused" Apr 23 08:51:02.677011 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:02.676971 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pnnnr" event={"ID":"85967c1f-fcc5-477d-949f-62d16a82fb18","Type":"ContainerStarted","Data":"af976a97c1acfe009863fd3cf9440578ef0e642012292f56a7d15feeb72dd01c"} Apr 23 08:51:02.695245 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:02.695196 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-pnnnr" podStartSLOduration=138.906263453 podStartE2EDuration="2m20.695183192s" podCreationTimestamp="2026-04-23 08:48:42 +0000 UTC" firstStartedPulling="2026-04-23 08:51:00.204567032 +0000 UTC m=+171.721426351" lastFinishedPulling="2026-04-23 08:51:01.993486775 +0000 UTC m=+173.510346090" observedRunningTime="2026-04-23 08:51:02.69477801 +0000 UTC m=+174.211637345" watchObservedRunningTime="2026-04-23 08:51:02.695183192 +0000 UTC m=+174.212042527" Apr 23 08:51:03.599478 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:03.599434 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-699bc9f49f-v54rl" Apr 23 08:51:03.599478 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:03.599489 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-699bc9f49f-v54rl" Apr 23 08:51:03.600995 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:03.600970 2574 patch_prober.go:28] interesting pod/console-699bc9f49f-v54rl container/console namespace/openshift-console: Startup probe 
status=failure output="Get \"https://10.134.0.24:8443/health\": dial tcp 10.134.0.24:8443: connect: connection refused" start-of-body= Apr 23 08:51:03.601106 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:03.601027 2574 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-699bc9f49f-v54rl" podUID="24bf8987-c434-4d6c-896d-3e931cd9807a" containerName="console" probeResult="failure" output="Get \"https://10.134.0.24:8443/health\": dial tcp 10.134.0.24:8443: connect: connection refused" Apr 23 08:51:04.650209 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:04.650181 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fxlql" Apr 23 08:51:11.331455 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:11.331419 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt" Apr 23 08:51:11.331823 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:11.331465 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt" Apr 23 08:51:12.535678 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:12.535645 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:51:12.539436 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:12.539416 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:51:13.603178 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:13.603152 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-699bc9f49f-v54rl" Apr 23 08:51:13.606888 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:13.606860 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-699bc9f49f-v54rl" Apr 23 08:51:13.669946 ip-10-0-139-48 
kubenswrapper[2574]: I0423 08:51:13.669915 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56bd7886c-mnnsv"] Apr 23 08:51:17.722900 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:17.722868 2574 generic.go:358] "Generic (PLEG): container finished" podID="eba944cc-6425-4433-beef-e901be182078" containerID="11a08b618450dc00465ed70d161bdc1662aeb1e9af6c5b3e778e4ea63313fba0" exitCode=0 Apr 23 08:51:17.723226 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:17.722933 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g296" event={"ID":"eba944cc-6425-4433-beef-e901be182078","Type":"ContainerDied","Data":"11a08b618450dc00465ed70d161bdc1662aeb1e9af6c5b3e778e4ea63313fba0"} Apr 23 08:51:17.723291 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:17.723248 2574 scope.go:117] "RemoveContainer" containerID="11a08b618450dc00465ed70d161bdc1662aeb1e9af6c5b3e778e4ea63313fba0" Apr 23 08:51:18.265471 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.265423 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-9949b69cb-2t8pg" podUID="9fbfde04-c427-426f-95d3-93915990fa2a" containerName="console" containerID="cri-o://0d8d0935792f8116d1644ccee4a29c7e2268f107efee6f49445c95755d32b26b" gracePeriod=15 Apr 23 08:51:18.495756 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.495735 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9949b69cb-2t8pg_9fbfde04-c427-426f-95d3-93915990fa2a/console/0.log" Apr 23 08:51:18.495851 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.495793 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:51:18.665328 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.665297 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9fbfde04-c427-426f-95d3-93915990fa2a-service-ca\") pod \"9fbfde04-c427-426f-95d3-93915990fa2a\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " Apr 23 08:51:18.665531 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.665338 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9fbfde04-c427-426f-95d3-93915990fa2a-oauth-serving-cert\") pod \"9fbfde04-c427-426f-95d3-93915990fa2a\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " Apr 23 08:51:18.665531 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.665362 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9fbfde04-c427-426f-95d3-93915990fa2a-console-config\") pod \"9fbfde04-c427-426f-95d3-93915990fa2a\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " Apr 23 08:51:18.665531 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.665381 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9fbfde04-c427-426f-95d3-93915990fa2a-console-oauth-config\") pod \"9fbfde04-c427-426f-95d3-93915990fa2a\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " Apr 23 08:51:18.665531 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.665509 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhjq2\" (UniqueName: \"kubernetes.io/projected/9fbfde04-c427-426f-95d3-93915990fa2a-kube-api-access-hhjq2\") pod \"9fbfde04-c427-426f-95d3-93915990fa2a\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " Apr 23 08:51:18.665729 ip-10-0-139-48 
kubenswrapper[2574]: I0423 08:51:18.665570 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9fbfde04-c427-426f-95d3-93915990fa2a-console-serving-cert\") pod \"9fbfde04-c427-426f-95d3-93915990fa2a\" (UID: \"9fbfde04-c427-426f-95d3-93915990fa2a\") " Apr 23 08:51:18.665793 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.665766 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fbfde04-c427-426f-95d3-93915990fa2a-console-config" (OuterVolumeSpecName: "console-config") pod "9fbfde04-c427-426f-95d3-93915990fa2a" (UID: "9fbfde04-c427-426f-95d3-93915990fa2a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:51:18.665839 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.665783 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fbfde04-c427-426f-95d3-93915990fa2a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9fbfde04-c427-426f-95d3-93915990fa2a" (UID: "9fbfde04-c427-426f-95d3-93915990fa2a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:51:18.665839 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.665804 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fbfde04-c427-426f-95d3-93915990fa2a-service-ca" (OuterVolumeSpecName: "service-ca") pod "9fbfde04-c427-426f-95d3-93915990fa2a" (UID: "9fbfde04-c427-426f-95d3-93915990fa2a"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:51:18.665929 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.665917 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9fbfde04-c427-426f-95d3-93915990fa2a-service-ca\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:51:18.665976 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.665937 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9fbfde04-c427-426f-95d3-93915990fa2a-oauth-serving-cert\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:51:18.665976 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.665953 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9fbfde04-c427-426f-95d3-93915990fa2a-console-config\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:51:18.667903 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.667879 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fbfde04-c427-426f-95d3-93915990fa2a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9fbfde04-c427-426f-95d3-93915990fa2a" (UID: "9fbfde04-c427-426f-95d3-93915990fa2a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:51:18.668019 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.667906 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fbfde04-c427-426f-95d3-93915990fa2a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9fbfde04-c427-426f-95d3-93915990fa2a" (UID: "9fbfde04-c427-426f-95d3-93915990fa2a"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:51:18.668019 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.667940 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fbfde04-c427-426f-95d3-93915990fa2a-kube-api-access-hhjq2" (OuterVolumeSpecName: "kube-api-access-hhjq2") pod "9fbfde04-c427-426f-95d3-93915990fa2a" (UID: "9fbfde04-c427-426f-95d3-93915990fa2a"). InnerVolumeSpecName "kube-api-access-hhjq2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:51:18.726722 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.726699 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9949b69cb-2t8pg_9fbfde04-c427-426f-95d3-93915990fa2a/console/0.log" Apr 23 08:51:18.727134 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.726736 2574 generic.go:358] "Generic (PLEG): container finished" podID="9fbfde04-c427-426f-95d3-93915990fa2a" containerID="0d8d0935792f8116d1644ccee4a29c7e2268f107efee6f49445c95755d32b26b" exitCode=2 Apr 23 08:51:18.727134 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.726819 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9949b69cb-2t8pg" event={"ID":"9fbfde04-c427-426f-95d3-93915990fa2a","Type":"ContainerDied","Data":"0d8d0935792f8116d1644ccee4a29c7e2268f107efee6f49445c95755d32b26b"} Apr 23 08:51:18.727134 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.726821 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9949b69cb-2t8pg" Apr 23 08:51:18.727134 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.726848 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9949b69cb-2t8pg" event={"ID":"9fbfde04-c427-426f-95d3-93915990fa2a","Type":"ContainerDied","Data":"e2bf01d201db10b791d691c1232106c9bc199ae211f6782c539d0c62f93e790e"} Apr 23 08:51:18.727134 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.726867 2574 scope.go:117] "RemoveContainer" containerID="0d8d0935792f8116d1644ccee4a29c7e2268f107efee6f49445c95755d32b26b" Apr 23 08:51:18.728653 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.728630 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g296" event={"ID":"eba944cc-6425-4433-beef-e901be182078","Type":"ContainerStarted","Data":"7970a2ac5424f1263120555b5eb17b6ad13212fefe25a144e43be84839bf6f16"} Apr 23 08:51:18.736055 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.736029 2574 scope.go:117] "RemoveContainer" containerID="0d8d0935792f8116d1644ccee4a29c7e2268f107efee6f49445c95755d32b26b" Apr 23 08:51:18.736296 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:51:18.736278 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d8d0935792f8116d1644ccee4a29c7e2268f107efee6f49445c95755d32b26b\": container with ID starting with 0d8d0935792f8116d1644ccee4a29c7e2268f107efee6f49445c95755d32b26b not found: ID does not exist" containerID="0d8d0935792f8116d1644ccee4a29c7e2268f107efee6f49445c95755d32b26b" Apr 23 08:51:18.736339 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.736303 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d8d0935792f8116d1644ccee4a29c7e2268f107efee6f49445c95755d32b26b"} err="failed to get container status 
\"0d8d0935792f8116d1644ccee4a29c7e2268f107efee6f49445c95755d32b26b\": rpc error: code = NotFound desc = could not find container \"0d8d0935792f8116d1644ccee4a29c7e2268f107efee6f49445c95755d32b26b\": container with ID starting with 0d8d0935792f8116d1644ccee4a29c7e2268f107efee6f49445c95755d32b26b not found: ID does not exist" Apr 23 08:51:18.764627 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.764600 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9949b69cb-2t8pg"] Apr 23 08:51:18.766655 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.766630 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hhjq2\" (UniqueName: \"kubernetes.io/projected/9fbfde04-c427-426f-95d3-93915990fa2a-kube-api-access-hhjq2\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:51:18.766771 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.766660 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9fbfde04-c427-426f-95d3-93915990fa2a-console-serving-cert\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:51:18.766771 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.766679 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9fbfde04-c427-426f-95d3-93915990fa2a-console-oauth-config\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:51:18.773099 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:18.773077 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-9949b69cb-2t8pg"] Apr 23 08:51:19.069780 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:19.069691 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fbfde04-c427-426f-95d3-93915990fa2a" path="/var/lib/kubelet/pods/9fbfde04-c427-426f-95d3-93915990fa2a/volumes" Apr 23 08:51:22.742249 ip-10-0-139-48 kubenswrapper[2574]: I0423 
08:51:22.742211 2574 generic.go:358] "Generic (PLEG): container finished" podID="976abe21-f40b-42ec-b745-de58f1628c36" containerID="9cf952471261a79943d35e725d083f472a3e43b2b9f656e4c550d183729e380e" exitCode=0 Apr 23 08:51:22.742682 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:22.742290 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-wsgmm" event={"ID":"976abe21-f40b-42ec-b745-de58f1628c36","Type":"ContainerDied","Data":"9cf952471261a79943d35e725d083f472a3e43b2b9f656e4c550d183729e380e"} Apr 23 08:51:22.742745 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:22.742729 2574 scope.go:117] "RemoveContainer" containerID="9cf952471261a79943d35e725d083f472a3e43b2b9f656e4c550d183729e380e" Apr 23 08:51:23.746562 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:23.746526 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-wsgmm" event={"ID":"976abe21-f40b-42ec-b745-de58f1628c36","Type":"ContainerStarted","Data":"c0d2b0c239aac8fd0994c0a0dec3d0dc16e4038c6c95a5492c558d86b447c65c"} Apr 23 08:51:31.336842 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:31.336811 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt" Apr 23 08:51:31.340711 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:31.340688 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-666ff68b9f-kzfmt" Apr 23 08:51:32.774537 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:32.774501 2574 generic.go:358] "Generic (PLEG): container finished" podID="6ef9bf61-ead5-4bec-bd04-a167a6f7321f" containerID="9c4c05fa948b133540ac81b92d64ceaa3d87d3a79d7e70b2a812d42739919165" exitCode=0 Apr 23 08:51:32.774989 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:32.774566 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cvwjc" event={"ID":"6ef9bf61-ead5-4bec-bd04-a167a6f7321f","Type":"ContainerDied","Data":"9c4c05fa948b133540ac81b92d64ceaa3d87d3a79d7e70b2a812d42739919165"} Apr 23 08:51:32.774989 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:32.774917 2574 scope.go:117] "RemoveContainer" containerID="9c4c05fa948b133540ac81b92d64ceaa3d87d3a79d7e70b2a812d42739919165" Apr 23 08:51:33.779533 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:33.779499 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cvwjc" event={"ID":"6ef9bf61-ead5-4bec-bd04-a167a6f7321f","Type":"ContainerStarted","Data":"c6ee30d78af6c237dbf708596818f8b64dc4fe46d3c13600ef81ad4ec8514486"} Apr 23 08:51:38.729740 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:38.729680 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-56bd7886c-mnnsv" podUID="d2cb1832-d60b-4124-bae2-26882e380037" containerName="console" containerID="cri-o://c699c64f409b53885198105f9c1ee0af248ff3ef5cf14924c5390c7089131180" gracePeriod=15 Apr 23 08:51:38.977144 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:38.977123 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56bd7886c-mnnsv_d2cb1832-d60b-4124-bae2-26882e380037/console/0.log" Apr 23 08:51:38.977256 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:38.977182 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:51:39.049361 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.049294 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2cb1832-d60b-4124-bae2-26882e380037-console-oauth-config\") pod \"d2cb1832-d60b-4124-bae2-26882e380037\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " Apr 23 08:51:39.049361 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.049326 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck5v8\" (UniqueName: \"kubernetes.io/projected/d2cb1832-d60b-4124-bae2-26882e380037-kube-api-access-ck5v8\") pod \"d2cb1832-d60b-4124-bae2-26882e380037\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " Apr 23 08:51:39.049587 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.049375 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-trusted-ca-bundle\") pod \"d2cb1832-d60b-4124-bae2-26882e380037\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " Apr 23 08:51:39.049587 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.049419 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-console-config\") pod \"d2cb1832-d60b-4124-bae2-26882e380037\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " Apr 23 08:51:39.049587 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.049443 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-service-ca\") pod \"d2cb1832-d60b-4124-bae2-26882e380037\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " Apr 23 08:51:39.049587 ip-10-0-139-48 
kubenswrapper[2574]: I0423 08:51:39.049486 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2cb1832-d60b-4124-bae2-26882e380037-console-serving-cert\") pod \"d2cb1832-d60b-4124-bae2-26882e380037\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " Apr 23 08:51:39.049587 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.049516 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-oauth-serving-cert\") pod \"d2cb1832-d60b-4124-bae2-26882e380037\" (UID: \"d2cb1832-d60b-4124-bae2-26882e380037\") " Apr 23 08:51:39.050049 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.049990 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-service-ca" (OuterVolumeSpecName: "service-ca") pod "d2cb1832-d60b-4124-bae2-26882e380037" (UID: "d2cb1832-d60b-4124-bae2-26882e380037"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:51:39.050049 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.050012 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d2cb1832-d60b-4124-bae2-26882e380037" (UID: "d2cb1832-d60b-4124-bae2-26882e380037"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:51:39.050049 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.050031 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-console-config" (OuterVolumeSpecName: "console-config") pod "d2cb1832-d60b-4124-bae2-26882e380037" (UID: "d2cb1832-d60b-4124-bae2-26882e380037"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:51:39.050218 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.050047 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d2cb1832-d60b-4124-bae2-26882e380037" (UID: "d2cb1832-d60b-4124-bae2-26882e380037"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:51:39.051777 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.051753 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2cb1832-d60b-4124-bae2-26882e380037-kube-api-access-ck5v8" (OuterVolumeSpecName: "kube-api-access-ck5v8") pod "d2cb1832-d60b-4124-bae2-26882e380037" (UID: "d2cb1832-d60b-4124-bae2-26882e380037"). InnerVolumeSpecName "kube-api-access-ck5v8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:51:39.051865 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.051846 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2cb1832-d60b-4124-bae2-26882e380037-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d2cb1832-d60b-4124-bae2-26882e380037" (UID: "d2cb1832-d60b-4124-bae2-26882e380037"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:51:39.051921 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.051879 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2cb1832-d60b-4124-bae2-26882e380037-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d2cb1832-d60b-4124-bae2-26882e380037" (UID: "d2cb1832-d60b-4124-bae2-26882e380037"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:51:39.150417 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.150360 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-trusted-ca-bundle\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:51:39.150417 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.150417 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-console-config\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:51:39.150592 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.150432 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-service-ca\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:51:39.150592 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.150445 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2cb1832-d60b-4124-bae2-26882e380037-console-serving-cert\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:51:39.150592 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.150457 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/d2cb1832-d60b-4124-bae2-26882e380037-oauth-serving-cert\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:51:39.150592 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.150469 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2cb1832-d60b-4124-bae2-26882e380037-console-oauth-config\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:51:39.150592 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.150498 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ck5v8\" (UniqueName: \"kubernetes.io/projected/d2cb1832-d60b-4124-bae2-26882e380037-kube-api-access-ck5v8\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:51:39.800616 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.800587 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56bd7886c-mnnsv_d2cb1832-d60b-4124-bae2-26882e380037/console/0.log" Apr 23 08:51:39.801015 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.800627 2574 generic.go:358] "Generic (PLEG): container finished" podID="d2cb1832-d60b-4124-bae2-26882e380037" containerID="c699c64f409b53885198105f9c1ee0af248ff3ef5cf14924c5390c7089131180" exitCode=2 Apr 23 08:51:39.801015 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.800697 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56bd7886c-mnnsv" event={"ID":"d2cb1832-d60b-4124-bae2-26882e380037","Type":"ContainerDied","Data":"c699c64f409b53885198105f9c1ee0af248ff3ef5cf14924c5390c7089131180"} Apr 23 08:51:39.801015 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.800712 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56bd7886c-mnnsv" Apr 23 08:51:39.801015 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.800730 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56bd7886c-mnnsv" event={"ID":"d2cb1832-d60b-4124-bae2-26882e380037","Type":"ContainerDied","Data":"efa1de92e692605e5418b1f332d9e253ae95feb7f738ec578320d802885167c7"} Apr 23 08:51:39.801015 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.800747 2574 scope.go:117] "RemoveContainer" containerID="c699c64f409b53885198105f9c1ee0af248ff3ef5cf14924c5390c7089131180" Apr 23 08:51:39.809905 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.809886 2574 scope.go:117] "RemoveContainer" containerID="c699c64f409b53885198105f9c1ee0af248ff3ef5cf14924c5390c7089131180" Apr 23 08:51:39.810138 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:51:39.810122 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c699c64f409b53885198105f9c1ee0af248ff3ef5cf14924c5390c7089131180\": container with ID starting with c699c64f409b53885198105f9c1ee0af248ff3ef5cf14924c5390c7089131180 not found: ID does not exist" containerID="c699c64f409b53885198105f9c1ee0af248ff3ef5cf14924c5390c7089131180" Apr 23 08:51:39.810192 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.810143 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c699c64f409b53885198105f9c1ee0af248ff3ef5cf14924c5390c7089131180"} err="failed to get container status \"c699c64f409b53885198105f9c1ee0af248ff3ef5cf14924c5390c7089131180\": rpc error: code = NotFound desc = could not find container \"c699c64f409b53885198105f9c1ee0af248ff3ef5cf14924c5390c7089131180\": container with ID starting with c699c64f409b53885198105f9c1ee0af248ff3ef5cf14924c5390c7089131180 not found: ID does not exist" Apr 23 08:51:39.823710 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.823684 2574 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-56bd7886c-mnnsv"] Apr 23 08:51:39.831656 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:39.831636 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-56bd7886c-mnnsv"] Apr 23 08:51:41.069839 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:51:41.069810 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2cb1832-d60b-4124-bae2-26882e380037" path="/var/lib/kubelet/pods/d2cb1832-d60b-4124-bae2-26882e380037/volumes" Apr 23 08:52:11.168373 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.168343 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"] Apr 23 08:52:11.168774 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.168670 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2cb1832-d60b-4124-bae2-26882e380037" containerName="console" Apr 23 08:52:11.168774 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.168681 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2cb1832-d60b-4124-bae2-26882e380037" containerName="console" Apr 23 08:52:11.168774 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.168695 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9fbfde04-c427-426f-95d3-93915990fa2a" containerName="console" Apr 23 08:52:11.168774 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.168700 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fbfde04-c427-426f-95d3-93915990fa2a" containerName="console" Apr 23 08:52:11.168774 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.168711 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92f47ad8-406e-4c78-b8b6-7565d305e19d" containerName="registry" Apr 23 08:52:11.168774 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.168716 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f47ad8-406e-4c78-b8b6-7565d305e19d" 
containerName="registry" Apr 23 08:52:11.168774 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.168768 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="92f47ad8-406e-4c78-b8b6-7565d305e19d" containerName="registry" Apr 23 08:52:11.168774 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.168776 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9fbfde04-c427-426f-95d3-93915990fa2a" containerName="console" Apr 23 08:52:11.169007 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.168784 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2cb1832-d60b-4124-bae2-26882e380037" containerName="console" Apr 23 08:52:11.173418 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.173376 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4" Apr 23 08:52:11.176246 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.176221 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 23 08:52:11.176370 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.176305 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-xhxg2\"" Apr 23 08:52:11.176370 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.176316 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 23 08:52:11.176749 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.176736 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 23 08:52:11.176791 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.176738 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 23 08:52:11.180658 
ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.180635 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 23 08:52:11.183703 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.183679 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"] Apr 23 08:52:11.184774 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.184758 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 23 08:52:11.312709 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.312672 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4aa32170-88fe-4329-9507-487a81e56002-telemeter-trusted-ca-bundle\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4" Apr 23 08:52:11.312877 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.312727 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4aa32170-88fe-4329-9507-487a81e56002-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4" Apr 23 08:52:11.312877 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.312794 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4aa32170-88fe-4329-9507-487a81e56002-metrics-client-ca\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: 
\"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4" Apr 23 08:52:11.312877 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.312855 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsc8r\" (UniqueName: \"kubernetes.io/projected/4aa32170-88fe-4329-9507-487a81e56002-kube-api-access-hsc8r\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4" Apr 23 08:52:11.312980 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.312911 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4aa32170-88fe-4329-9507-487a81e56002-serving-certs-ca-bundle\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4" Apr 23 08:52:11.312980 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.312930 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4aa32170-88fe-4329-9507-487a81e56002-federate-client-tls\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4" Apr 23 08:52:11.312980 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.312949 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4aa32170-88fe-4329-9507-487a81e56002-telemeter-client-tls\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4" Apr 23 08:52:11.313070 ip-10-0-139-48 kubenswrapper[2574]: 
I0423 08:52:11.313022 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4aa32170-88fe-4329-9507-487a81e56002-secret-telemeter-client\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"
Apr 23 08:52:11.413638 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.413588 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4aa32170-88fe-4329-9507-487a81e56002-metrics-client-ca\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"
Apr 23 08:52:11.413638 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.413644 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsc8r\" (UniqueName: \"kubernetes.io/projected/4aa32170-88fe-4329-9507-487a81e56002-kube-api-access-hsc8r\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"
Apr 23 08:52:11.413854 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.413691 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4aa32170-88fe-4329-9507-487a81e56002-serving-certs-ca-bundle\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"
Apr 23 08:52:11.413854 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.413713 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4aa32170-88fe-4329-9507-487a81e56002-federate-client-tls\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"
Apr 23 08:52:11.413854 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.413730 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4aa32170-88fe-4329-9507-487a81e56002-telemeter-client-tls\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"
Apr 23 08:52:11.414001 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.413892 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4aa32170-88fe-4329-9507-487a81e56002-secret-telemeter-client\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"
Apr 23 08:52:11.414001 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.413959 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4aa32170-88fe-4329-9507-487a81e56002-telemeter-trusted-ca-bundle\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"
Apr 23 08:52:11.414092 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.414035 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4aa32170-88fe-4329-9507-487a81e56002-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"
Apr 23 08:52:11.414729 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.414619 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4aa32170-88fe-4329-9507-487a81e56002-metrics-client-ca\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"
Apr 23 08:52:11.414729 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.414746 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4aa32170-88fe-4329-9507-487a81e56002-serving-certs-ca-bundle\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"
Apr 23 08:52:11.415195 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.415170 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4aa32170-88fe-4329-9507-487a81e56002-telemeter-trusted-ca-bundle\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"
Apr 23 08:52:11.416482 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.416461 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4aa32170-88fe-4329-9507-487a81e56002-federate-client-tls\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"
Apr 23 08:52:11.416581 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.416492 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4aa32170-88fe-4329-9507-487a81e56002-telemeter-client-tls\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"
Apr 23 08:52:11.416581 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.416537 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4aa32170-88fe-4329-9507-487a81e56002-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"
Apr 23 08:52:11.416683 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.416609 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4aa32170-88fe-4329-9507-487a81e56002-secret-telemeter-client\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"
Apr 23 08:52:11.424222 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.424159 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsc8r\" (UniqueName: \"kubernetes.io/projected/4aa32170-88fe-4329-9507-487a81e56002-kube-api-access-hsc8r\") pod \"telemeter-client-59b68b5fdf-d2cb4\" (UID: \"4aa32170-88fe-4329-9507-487a81e56002\") " pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"
Apr 23 08:52:11.482803 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.482767 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"
Apr 23 08:52:11.631621 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.631595 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4"]
Apr 23 08:52:11.634372 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:52:11.634341 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aa32170_88fe_4329_9507_487a81e56002.slice/crio-4976c1fd314e28e9fb3044e947d07a8c0b0152b8fd758d77b39efaa3390ce58f WatchSource:0}: Error finding container 4976c1fd314e28e9fb3044e947d07a8c0b0152b8fd758d77b39efaa3390ce58f: Status 404 returned error can't find the container with id 4976c1fd314e28e9fb3044e947d07a8c0b0152b8fd758d77b39efaa3390ce58f
Apr 23 08:52:11.896770 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:11.896733 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4" event={"ID":"4aa32170-88fe-4329-9507-487a81e56002","Type":"ContainerStarted","Data":"4976c1fd314e28e9fb3044e947d07a8c0b0152b8fd758d77b39efaa3390ce58f"}
Apr 23 08:52:13.905268 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:13.905220 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4" event={"ID":"4aa32170-88fe-4329-9507-487a81e56002","Type":"ContainerStarted","Data":"70e2029b34ce7c64aee3b979425d2596e42ac354d268d16e110d3833a32d7186"}
Apr 23 08:52:14.910433 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:14.910375 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4" event={"ID":"4aa32170-88fe-4329-9507-487a81e56002","Type":"ContainerStarted","Data":"444ff37b29023c5b8765e533a1acfc8c7b612430a3f5786288a9dc22bfde7a65"}
Apr 23 08:52:14.910433 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:14.910434 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4" event={"ID":"4aa32170-88fe-4329-9507-487a81e56002","Type":"ContainerStarted","Data":"1f8e971f3539ddace838221227daa710672b6eb5a12c38d5ff0b471c8ec1cbf7"}
Apr 23 08:52:14.940942 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:14.940903 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-59b68b5fdf-d2cb4" podStartSLOduration=1.7494879220000001 podStartE2EDuration="3.940890188s" podCreationTimestamp="2026-04-23 08:52:11 +0000 UTC" firstStartedPulling="2026-04-23 08:52:11.63652903 +0000 UTC m=+243.153388344" lastFinishedPulling="2026-04-23 08:52:13.827931294 +0000 UTC m=+245.344790610" observedRunningTime="2026-04-23 08:52:14.939068547 +0000 UTC m=+246.455927884" watchObservedRunningTime="2026-04-23 08:52:14.940890188 +0000 UTC m=+246.457749522"
Apr 23 08:52:15.585614 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.585579 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68d6499b7d-fjtrh"]
Apr 23 08:52:15.589111 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.589089 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.603546 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.603519 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68d6499b7d-fjtrh"]
Apr 23 08:52:15.754592 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.754557 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c514a86-f313-4759-9b87-d8cc034f390c-console-serving-cert\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.754749 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.754611 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgsj4\" (UniqueName: \"kubernetes.io/projected/9c514a86-f313-4759-9b87-d8cc034f390c-kube-api-access-mgsj4\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.754749 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.754689 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-trusted-ca-bundle\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.754749 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.754721 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-console-config\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.754749 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.754744 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-oauth-serving-cert\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.754891 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.754810 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c514a86-f313-4759-9b87-d8cc034f390c-console-oauth-config\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.754891 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.754835 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-service-ca\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.856128 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.856064 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgsj4\" (UniqueName: \"kubernetes.io/projected/9c514a86-f313-4759-9b87-d8cc034f390c-kube-api-access-mgsj4\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.856128 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.856111 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-trusted-ca-bundle\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.856256 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.856129 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-console-config\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.856357 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.856338 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-oauth-serving-cert\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.856428 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.856412 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c514a86-f313-4759-9b87-d8cc034f390c-console-oauth-config\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.856467 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.856448 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-service-ca\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.856520 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.856484 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c514a86-f313-4759-9b87-d8cc034f390c-console-serving-cert\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.856938 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.856910 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-console-config\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.857036 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.857008 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-oauth-serving-cert\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.857073 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.857028 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-service-ca\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.857073 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.857048 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-trusted-ca-bundle\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.858913 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.858884 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c514a86-f313-4759-9b87-d8cc034f390c-console-serving-cert\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.859008 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.858945 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c514a86-f313-4759-9b87-d8cc034f390c-console-oauth-config\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.867370 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.867342 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgsj4\" (UniqueName: \"kubernetes.io/projected/9c514a86-f313-4759-9b87-d8cc034f390c-kube-api-access-mgsj4\") pod \"console-68d6499b7d-fjtrh\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:15.898366 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:15.898339 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:16.021769 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:16.021740 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68d6499b7d-fjtrh"]
Apr 23 08:52:16.024723 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:52:16.024692 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c514a86_f313_4759_9b87_d8cc034f390c.slice/crio-5e0c8132dfe7242dd77d410c83be586bebef73fef30617c450ca89df1dd850d7 WatchSource:0}: Error finding container 5e0c8132dfe7242dd77d410c83be586bebef73fef30617c450ca89df1dd850d7: Status 404 returned error can't find the container with id 5e0c8132dfe7242dd77d410c83be586bebef73fef30617c450ca89df1dd850d7
Apr 23 08:52:16.917679 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:16.917641 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68d6499b7d-fjtrh" event={"ID":"9c514a86-f313-4759-9b87-d8cc034f390c","Type":"ContainerStarted","Data":"37df46f9af0bd20152fa73cd55bb88c94df5ad33dddcfd54bf934a9838f3ca22"}
Apr 23 08:52:16.917679 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:16.917677 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68d6499b7d-fjtrh" event={"ID":"9c514a86-f313-4759-9b87-d8cc034f390c","Type":"ContainerStarted","Data":"5e0c8132dfe7242dd77d410c83be586bebef73fef30617c450ca89df1dd850d7"}
Apr 23 08:52:16.939451 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:16.939376 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68d6499b7d-fjtrh" podStartSLOduration=1.939363022 podStartE2EDuration="1.939363022s" podCreationTimestamp="2026-04-23 08:52:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:52:16.938538096 +0000 UTC m=+248.455397431" watchObservedRunningTime="2026-04-23 08:52:16.939363022 +0000 UTC m=+248.456222336"
Apr 23 08:52:20.801187 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:20.801157 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs\") pod \"network-metrics-daemon-wnmff\" (UID: \"b87e8c66-90ff-454c-9c82-3fe28797e8df\") " pod="openshift-multus/network-metrics-daemon-wnmff"
Apr 23 08:52:20.803669 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:20.803644 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b87e8c66-90ff-454c-9c82-3fe28797e8df-metrics-certs\") pod \"network-metrics-daemon-wnmff\" (UID: \"b87e8c66-90ff-454c-9c82-3fe28797e8df\") " pod="openshift-multus/network-metrics-daemon-wnmff"
Apr 23 08:52:20.869508 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:20.869485 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zffjs\""
Apr 23 08:52:20.876831 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:20.876812 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wnmff"
Apr 23 08:52:20.998267 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:20.998241 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wnmff"]
Apr 23 08:52:21.001067 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:52:21.001042 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb87e8c66_90ff_454c_9c82_3fe28797e8df.slice/crio-7542f154955648cd6bc663a418157004639bc74db86ad829d733225683fa90df WatchSource:0}: Error finding container 7542f154955648cd6bc663a418157004639bc74db86ad829d733225683fa90df: Status 404 returned error can't find the container with id 7542f154955648cd6bc663a418157004639bc74db86ad829d733225683fa90df
Apr 23 08:52:21.935867 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:21.935824 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wnmff" event={"ID":"b87e8c66-90ff-454c-9c82-3fe28797e8df","Type":"ContainerStarted","Data":"7542f154955648cd6bc663a418157004639bc74db86ad829d733225683fa90df"}
Apr 23 08:52:22.940405 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:22.940352 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wnmff" event={"ID":"b87e8c66-90ff-454c-9c82-3fe28797e8df","Type":"ContainerStarted","Data":"baae2ea4dca4d2d6b3dfbf9b10a36171b1f721923212056c4a433971cbe11c12"}
Apr 23 08:52:22.940405 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:22.940410 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wnmff" event={"ID":"b87e8c66-90ff-454c-9c82-3fe28797e8df","Type":"ContainerStarted","Data":"e2571fc39ec47f0575ad8d482efc3346c5f37cac4517f007101502364e8cfda9"}
Apr 23 08:52:22.960199 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:22.960157 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wnmff" podStartSLOduration=253.068346216 podStartE2EDuration="4m13.960145701s" podCreationTimestamp="2026-04-23 08:48:09 +0000 UTC" firstStartedPulling="2026-04-23 08:52:21.002718813 +0000 UTC m=+252.519578128" lastFinishedPulling="2026-04-23 08:52:21.894518299 +0000 UTC m=+253.411377613" observedRunningTime="2026-04-23 08:52:22.958913463 +0000 UTC m=+254.475772795" watchObservedRunningTime="2026-04-23 08:52:22.960145701 +0000 UTC m=+254.477005036"
Apr 23 08:52:25.899222 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:25.899146 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:25.899222 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:25.899189 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:25.903917 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:25.903894 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:25.954312 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:25.954287 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68d6499b7d-fjtrh"
Apr 23 08:52:26.015027 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.014993 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-699bc9f49f-v54rl"]
Apr 23 08:52:26.162160 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.162081 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-bc9557b5f-krddp"]
Apr 23 08:52:26.167380 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.167359 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.177911 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.177881 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bc9557b5f-krddp"]
Apr 23 08:52:26.245777 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.245747 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-console-oauth-config\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.245912 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.245791 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-console-serving-cert\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.245912 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.245817 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnxq8\" (UniqueName: \"kubernetes.io/projected/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-kube-api-access-bnxq8\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.245912 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.245855 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-console-config\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.245912 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.245874 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-oauth-serving-cert\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.246039 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.245934 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-service-ca\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.246039 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.245968 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-trusted-ca-bundle\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.347131 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.347099 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-console-serving-cert\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.347131 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.347134 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnxq8\" (UniqueName: \"kubernetes.io/projected/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-kube-api-access-bnxq8\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.347305 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.347171 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-console-config\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.347370 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.347342 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-oauth-serving-cert\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.347456 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.347441 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-service-ca\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.347517 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.347472 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-trusted-ca-bundle\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.347650 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.347600 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-console-oauth-config\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.347935 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.347902 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-console-config\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.348035 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.348012 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-oauth-serving-cert\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.348173 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.348150 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-service-ca\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.348701 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.348680 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-trusted-ca-bundle\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.349985 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.349969 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-console-serving-cert\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.350151 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.350129 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-console-oauth-config\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.356110 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.356088 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnxq8\" (UniqueName: \"kubernetes.io/projected/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-kube-api-access-bnxq8\") pod \"console-bc9557b5f-krddp\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") " pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.476494 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.476401 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:26.608540 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.608498 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bc9557b5f-krddp"]
Apr 23 08:52:26.611526 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:52:26.611496 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc9dcdc7_2c30_4e50_bca9_015e477af7e8.slice/crio-199a0fabd24855daa18958eb36e821dc2b3265e9a9955adc4d01f94138e0b5e4 WatchSource:0}: Error finding container 199a0fabd24855daa18958eb36e821dc2b3265e9a9955adc4d01f94138e0b5e4: Status 404 returned error can't find the container with id 199a0fabd24855daa18958eb36e821dc2b3265e9a9955adc4d01f94138e0b5e4
Apr 23 08:52:26.955304 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.955269 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bc9557b5f-krddp" event={"ID":"fc9dcdc7-2c30-4e50-bca9-015e477af7e8","Type":"ContainerStarted","Data":"de4ad3be8be9210bcee70590b0b4924e02c6e38c7fba87a9c98fca903d94a1b8"}
Apr 23 08:52:26.955304 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.955306 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bc9557b5f-krddp" event={"ID":"fc9dcdc7-2c30-4e50-bca9-015e477af7e8","Type":"ContainerStarted","Data":"199a0fabd24855daa18958eb36e821dc2b3265e9a9955adc4d01f94138e0b5e4"}
Apr 23 08:52:26.977095 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:26.977050 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-bc9557b5f-krddp" podStartSLOduration=0.977036119 podStartE2EDuration="977.036119ms" podCreationTimestamp="2026-04-23 08:52:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:52:26.975624721 +0000 UTC m=+258.492484057" watchObservedRunningTime="2026-04-23 08:52:26.977036119 +0000 UTC m=+258.493895447"
Apr 23 08:52:36.476948 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:36.476905 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:36.477438 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:36.476965 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:36.481646 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:36.481624 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:36.989628 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:36.989603 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:52:37.046637 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:37.046608 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68d6499b7d-fjtrh"]
Apr 23 08:52:51.039055 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.039000 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-699bc9f49f-v54rl" podUID="24bf8987-c434-4d6c-896d-3e931cd9807a" containerName="console" containerID="cri-o://cf299446f867205acabb6409429fe10ea563522f6cd9e457d83471a7fd580fff" gracePeriod=15
Apr 23 08:52:51.285368 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.285341 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-699bc9f49f-v54rl_24bf8987-c434-4d6c-896d-3e931cd9807a/console/0.log"
Apr 23 08:52:51.285493 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.285428 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-699bc9f49f-v54rl" Apr 23 08:52:51.354223 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.354195 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tfkv\" (UniqueName: \"kubernetes.io/projected/24bf8987-c434-4d6c-896d-3e931cd9807a-kube-api-access-6tfkv\") pod \"24bf8987-c434-4d6c-896d-3e931cd9807a\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " Apr 23 08:52:51.354364 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.354248 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-oauth-serving-cert\") pod \"24bf8987-c434-4d6c-896d-3e931cd9807a\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " Apr 23 08:52:51.354364 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.354282 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/24bf8987-c434-4d6c-896d-3e931cd9807a-console-oauth-config\") pod \"24bf8987-c434-4d6c-896d-3e931cd9807a\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " Apr 23 08:52:51.354364 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.354305 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-service-ca\") pod \"24bf8987-c434-4d6c-896d-3e931cd9807a\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " Apr 23 08:52:51.354364 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.354335 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-trusted-ca-bundle\") pod \"24bf8987-c434-4d6c-896d-3e931cd9807a\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " Apr 23 08:52:51.354579 
ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.354383 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-console-config\") pod \"24bf8987-c434-4d6c-896d-3e931cd9807a\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " Apr 23 08:52:51.354579 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.354452 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/24bf8987-c434-4d6c-896d-3e931cd9807a-console-serving-cert\") pod \"24bf8987-c434-4d6c-896d-3e931cd9807a\" (UID: \"24bf8987-c434-4d6c-896d-3e931cd9807a\") " Apr 23 08:52:51.354851 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.354802 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "24bf8987-c434-4d6c-896d-3e931cd9807a" (UID: "24bf8987-c434-4d6c-896d-3e931cd9807a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:52:51.354851 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.354827 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-service-ca" (OuterVolumeSpecName: "service-ca") pod "24bf8987-c434-4d6c-896d-3e931cd9807a" (UID: "24bf8987-c434-4d6c-896d-3e931cd9807a"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:52:51.355199 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.354939 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-console-config" (OuterVolumeSpecName: "console-config") pod "24bf8987-c434-4d6c-896d-3e931cd9807a" (UID: "24bf8987-c434-4d6c-896d-3e931cd9807a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:52:51.355199 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.354981 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "24bf8987-c434-4d6c-896d-3e931cd9807a" (UID: "24bf8987-c434-4d6c-896d-3e931cd9807a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:52:51.356572 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.356539 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bf8987-c434-4d6c-896d-3e931cd9807a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "24bf8987-c434-4d6c-896d-3e931cd9807a" (UID: "24bf8987-c434-4d6c-896d-3e931cd9807a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:52:51.356657 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.356589 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24bf8987-c434-4d6c-896d-3e931cd9807a-kube-api-access-6tfkv" (OuterVolumeSpecName: "kube-api-access-6tfkv") pod "24bf8987-c434-4d6c-896d-3e931cd9807a" (UID: "24bf8987-c434-4d6c-896d-3e931cd9807a"). InnerVolumeSpecName "kube-api-access-6tfkv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:52:51.356862 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.356845 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bf8987-c434-4d6c-896d-3e931cd9807a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "24bf8987-c434-4d6c-896d-3e931cd9807a" (UID: "24bf8987-c434-4d6c-896d-3e931cd9807a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:52:51.455325 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.455289 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/24bf8987-c434-4d6c-896d-3e931cd9807a-console-serving-cert\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:52:51.455325 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.455321 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6tfkv\" (UniqueName: \"kubernetes.io/projected/24bf8987-c434-4d6c-896d-3e931cd9807a-kube-api-access-6tfkv\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:52:51.455325 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.455330 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-oauth-serving-cert\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:52:51.455594 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.455338 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/24bf8987-c434-4d6c-896d-3e931cd9807a-console-oauth-config\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:52:51.455594 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.455355 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-service-ca\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:52:51.455594 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.455365 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-trusted-ca-bundle\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:52:51.455594 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:51.455373 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/24bf8987-c434-4d6c-896d-3e931cd9807a-console-config\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:52:52.034831 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:52.034807 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-699bc9f49f-v54rl_24bf8987-c434-4d6c-896d-3e931cd9807a/console/0.log" Apr 23 08:52:52.034992 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:52.034843 2574 generic.go:358] "Generic (PLEG): container finished" podID="24bf8987-c434-4d6c-896d-3e931cd9807a" containerID="cf299446f867205acabb6409429fe10ea563522f6cd9e457d83471a7fd580fff" exitCode=2 Apr 23 08:52:52.034992 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:52.034913 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-699bc9f49f-v54rl" Apr 23 08:52:52.035068 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:52.034916 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699bc9f49f-v54rl" event={"ID":"24bf8987-c434-4d6c-896d-3e931cd9807a","Type":"ContainerDied","Data":"cf299446f867205acabb6409429fe10ea563522f6cd9e457d83471a7fd580fff"} Apr 23 08:52:52.035068 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:52.035014 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699bc9f49f-v54rl" event={"ID":"24bf8987-c434-4d6c-896d-3e931cd9807a","Type":"ContainerDied","Data":"90b0ec22d1b40fbcdc66b64eac8bb2998ed569706ff2191bbb86e4cc51c7a49c"} Apr 23 08:52:52.035068 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:52.035029 2574 scope.go:117] "RemoveContainer" containerID="cf299446f867205acabb6409429fe10ea563522f6cd9e457d83471a7fd580fff" Apr 23 08:52:52.043707 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:52.043585 2574 scope.go:117] "RemoveContainer" containerID="cf299446f867205acabb6409429fe10ea563522f6cd9e457d83471a7fd580fff" Apr 23 08:52:52.043906 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:52:52.043838 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf299446f867205acabb6409429fe10ea563522f6cd9e457d83471a7fd580fff\": container with ID starting with cf299446f867205acabb6409429fe10ea563522f6cd9e457d83471a7fd580fff not found: ID does not exist" containerID="cf299446f867205acabb6409429fe10ea563522f6cd9e457d83471a7fd580fff" Apr 23 08:52:52.043906 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:52.043862 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf299446f867205acabb6409429fe10ea563522f6cd9e457d83471a7fd580fff"} err="failed to get container status \"cf299446f867205acabb6409429fe10ea563522f6cd9e457d83471a7fd580fff\": rpc error: code = 
NotFound desc = could not find container \"cf299446f867205acabb6409429fe10ea563522f6cd9e457d83471a7fd580fff\": container with ID starting with cf299446f867205acabb6409429fe10ea563522f6cd9e457d83471a7fd580fff not found: ID does not exist" Apr 23 08:52:52.057259 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:52.057239 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-699bc9f49f-v54rl"] Apr 23 08:52:52.061713 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:52.061682 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-699bc9f49f-v54rl"] Apr 23 08:52:53.069640 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:52:53.069608 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24bf8987-c434-4d6c-896d-3e931cd9807a" path="/var/lib/kubelet/pods/24bf8987-c434-4d6c-896d-3e931cd9807a/volumes" Apr 23 08:53:02.070105 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.070046 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-68d6499b7d-fjtrh" podUID="9c514a86-f313-4759-9b87-d8cc034f390c" containerName="console" containerID="cri-o://37df46f9af0bd20152fa73cd55bb88c94df5ad33dddcfd54bf934a9838f3ca22" gracePeriod=15 Apr 23 08:53:02.310615 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.310594 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68d6499b7d-fjtrh_9c514a86-f313-4759-9b87-d8cc034f390c/console/0.log" Apr 23 08:53:02.310727 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.310652 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68d6499b7d-fjtrh" Apr 23 08:53:02.445826 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.445784 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-service-ca\") pod \"9c514a86-f313-4759-9b87-d8cc034f390c\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " Apr 23 08:53:02.445826 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.445829 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c514a86-f313-4759-9b87-d8cc034f390c-console-serving-cert\") pod \"9c514a86-f313-4759-9b87-d8cc034f390c\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " Apr 23 08:53:02.446092 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.445851 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-trusted-ca-bundle\") pod \"9c514a86-f313-4759-9b87-d8cc034f390c\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " Apr 23 08:53:02.446092 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.445876 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-oauth-serving-cert\") pod \"9c514a86-f313-4759-9b87-d8cc034f390c\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " Apr 23 08:53:02.446092 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.445919 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgsj4\" (UniqueName: \"kubernetes.io/projected/9c514a86-f313-4759-9b87-d8cc034f390c-kube-api-access-mgsj4\") pod \"9c514a86-f313-4759-9b87-d8cc034f390c\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " Apr 23 08:53:02.446092 
ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.445938 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-console-config\") pod \"9c514a86-f313-4759-9b87-d8cc034f390c\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " Apr 23 08:53:02.446092 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.445967 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c514a86-f313-4759-9b87-d8cc034f390c-console-oauth-config\") pod \"9c514a86-f313-4759-9b87-d8cc034f390c\" (UID: \"9c514a86-f313-4759-9b87-d8cc034f390c\") " Apr 23 08:53:02.446332 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.446245 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-service-ca" (OuterVolumeSpecName: "service-ca") pod "9c514a86-f313-4759-9b87-d8cc034f390c" (UID: "9c514a86-f313-4759-9b87-d8cc034f390c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:53:02.446423 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.446366 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9c514a86-f313-4759-9b87-d8cc034f390c" (UID: "9c514a86-f313-4759-9b87-d8cc034f390c"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:53:02.446423 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.446412 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9c514a86-f313-4759-9b87-d8cc034f390c" (UID: "9c514a86-f313-4759-9b87-d8cc034f390c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:53:02.446536 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.446504 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-console-config" (OuterVolumeSpecName: "console-config") pod "9c514a86-f313-4759-9b87-d8cc034f390c" (UID: "9c514a86-f313-4759-9b87-d8cc034f390c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 08:53:02.448199 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.448172 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c514a86-f313-4759-9b87-d8cc034f390c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9c514a86-f313-4759-9b87-d8cc034f390c" (UID: "9c514a86-f313-4759-9b87-d8cc034f390c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:53:02.448298 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.448205 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c514a86-f313-4759-9b87-d8cc034f390c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9c514a86-f313-4759-9b87-d8cc034f390c" (UID: "9c514a86-f313-4759-9b87-d8cc034f390c"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:53:02.448298 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.448246 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c514a86-f313-4759-9b87-d8cc034f390c-kube-api-access-mgsj4" (OuterVolumeSpecName: "kube-api-access-mgsj4") pod "9c514a86-f313-4759-9b87-d8cc034f390c" (UID: "9c514a86-f313-4759-9b87-d8cc034f390c"). InnerVolumeSpecName "kube-api-access-mgsj4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:53:02.547489 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.547459 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mgsj4\" (UniqueName: \"kubernetes.io/projected/9c514a86-f313-4759-9b87-d8cc034f390c-kube-api-access-mgsj4\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:53:02.547489 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.547485 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-console-config\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:53:02.547636 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.547494 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c514a86-f313-4759-9b87-d8cc034f390c-console-oauth-config\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:53:02.547636 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.547504 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-service-ca\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:53:02.547636 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.547513 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9c514a86-f313-4759-9b87-d8cc034f390c-console-serving-cert\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:53:02.547636 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.547521 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-trusted-ca-bundle\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:53:02.547636 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:02.547531 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c514a86-f313-4759-9b87-d8cc034f390c-oauth-serving-cert\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:53:03.068128 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:03.068101 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68d6499b7d-fjtrh_9c514a86-f313-4759-9b87-d8cc034f390c/console/0.log" Apr 23 08:53:03.068303 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:03.068142 2574 generic.go:358] "Generic (PLEG): container finished" podID="9c514a86-f313-4759-9b87-d8cc034f390c" containerID="37df46f9af0bd20152fa73cd55bb88c94df5ad33dddcfd54bf934a9838f3ca22" exitCode=2 Apr 23 08:53:03.068303 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:03.068282 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68d6499b7d-fjtrh" Apr 23 08:53:03.069342 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:03.069315 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68d6499b7d-fjtrh" event={"ID":"9c514a86-f313-4759-9b87-d8cc034f390c","Type":"ContainerDied","Data":"37df46f9af0bd20152fa73cd55bb88c94df5ad33dddcfd54bf934a9838f3ca22"} Apr 23 08:53:03.069342 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:03.069341 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68d6499b7d-fjtrh" event={"ID":"9c514a86-f313-4759-9b87-d8cc034f390c","Type":"ContainerDied","Data":"5e0c8132dfe7242dd77d410c83be586bebef73fef30617c450ca89df1dd850d7"} Apr 23 08:53:03.069526 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:03.069357 2574 scope.go:117] "RemoveContainer" containerID="37df46f9af0bd20152fa73cd55bb88c94df5ad33dddcfd54bf934a9838f3ca22" Apr 23 08:53:03.078647 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:03.078498 2574 scope.go:117] "RemoveContainer" containerID="37df46f9af0bd20152fa73cd55bb88c94df5ad33dddcfd54bf934a9838f3ca22" Apr 23 08:53:03.078847 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:53:03.078764 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37df46f9af0bd20152fa73cd55bb88c94df5ad33dddcfd54bf934a9838f3ca22\": container with ID starting with 37df46f9af0bd20152fa73cd55bb88c94df5ad33dddcfd54bf934a9838f3ca22 not found: ID does not exist" containerID="37df46f9af0bd20152fa73cd55bb88c94df5ad33dddcfd54bf934a9838f3ca22" Apr 23 08:53:03.078847 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:03.078787 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37df46f9af0bd20152fa73cd55bb88c94df5ad33dddcfd54bf934a9838f3ca22"} err="failed to get container status \"37df46f9af0bd20152fa73cd55bb88c94df5ad33dddcfd54bf934a9838f3ca22\": rpc error: code = 
NotFound desc = could not find container \"37df46f9af0bd20152fa73cd55bb88c94df5ad33dddcfd54bf934a9838f3ca22\": container with ID starting with 37df46f9af0bd20152fa73cd55bb88c94df5ad33dddcfd54bf934a9838f3ca22 not found: ID does not exist" Apr 23 08:53:03.093910 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:03.093887 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68d6499b7d-fjtrh"] Apr 23 08:53:03.097722 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:03.097703 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68d6499b7d-fjtrh"] Apr 23 08:53:05.070335 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:05.070297 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c514a86-f313-4759-9b87-d8cc034f390c" path="/var/lib/kubelet/pods/9c514a86-f313-4759-9b87-d8cc034f390c/volumes" Apr 23 08:53:08.953687 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:08.953660 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log" Apr 23 08:53:08.954105 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:08.953757 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log" Apr 23 08:53:08.959554 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:08.959533 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 08:53:21.863720 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:21.863682 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-qtlms"] Apr 23 08:53:21.866194 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:21.864153 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24bf8987-c434-4d6c-896d-3e931cd9807a" containerName="console" Apr 23 08:53:21.866194 ip-10-0-139-48 kubenswrapper[2574]: I0423 
08:53:21.864170 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="24bf8987-c434-4d6c-896d-3e931cd9807a" containerName="console" Apr 23 08:53:21.866194 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:21.864205 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c514a86-f313-4759-9b87-d8cc034f390c" containerName="console" Apr 23 08:53:21.866194 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:21.864212 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c514a86-f313-4759-9b87-d8cc034f390c" containerName="console" Apr 23 08:53:21.866194 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:21.864269 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c514a86-f313-4759-9b87-d8cc034f390c" containerName="console" Apr 23 08:53:21.866194 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:21.864284 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="24bf8987-c434-4d6c-896d-3e931cd9807a" containerName="console" Apr 23 08:53:21.867166 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:21.867149 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qtlms" Apr 23 08:53:21.869899 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:21.869882 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 08:53:21.875972 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:21.875952 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qtlms"] Apr 23 08:53:21.906116 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:21.906090 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1ee7a30a-d0ec-443c-89c2-577e277215a1-dbus\") pod \"global-pull-secret-syncer-qtlms\" (UID: \"1ee7a30a-d0ec-443c-89c2-577e277215a1\") " pod="kube-system/global-pull-secret-syncer-qtlms" Apr 23 08:53:21.906245 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:21.906132 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1ee7a30a-d0ec-443c-89c2-577e277215a1-kubelet-config\") pod \"global-pull-secret-syncer-qtlms\" (UID: \"1ee7a30a-d0ec-443c-89c2-577e277215a1\") " pod="kube-system/global-pull-secret-syncer-qtlms" Apr 23 08:53:21.906245 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:21.906212 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1ee7a30a-d0ec-443c-89c2-577e277215a1-original-pull-secret\") pod \"global-pull-secret-syncer-qtlms\" (UID: \"1ee7a30a-d0ec-443c-89c2-577e277215a1\") " pod="kube-system/global-pull-secret-syncer-qtlms" Apr 23 08:53:22.007269 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:22.007231 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/1ee7a30a-d0ec-443c-89c2-577e277215a1-original-pull-secret\") pod \"global-pull-secret-syncer-qtlms\" (UID: \"1ee7a30a-d0ec-443c-89c2-577e277215a1\") " pod="kube-system/global-pull-secret-syncer-qtlms" Apr 23 08:53:22.007486 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:22.007335 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1ee7a30a-d0ec-443c-89c2-577e277215a1-dbus\") pod \"global-pull-secret-syncer-qtlms\" (UID: \"1ee7a30a-d0ec-443c-89c2-577e277215a1\") " pod="kube-system/global-pull-secret-syncer-qtlms" Apr 23 08:53:22.007486 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:22.007376 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1ee7a30a-d0ec-443c-89c2-577e277215a1-kubelet-config\") pod \"global-pull-secret-syncer-qtlms\" (UID: \"1ee7a30a-d0ec-443c-89c2-577e277215a1\") " pod="kube-system/global-pull-secret-syncer-qtlms" Apr 23 08:53:22.007576 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:22.007521 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1ee7a30a-d0ec-443c-89c2-577e277215a1-dbus\") pod \"global-pull-secret-syncer-qtlms\" (UID: \"1ee7a30a-d0ec-443c-89c2-577e277215a1\") " pod="kube-system/global-pull-secret-syncer-qtlms" Apr 23 08:53:22.007576 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:22.007525 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1ee7a30a-d0ec-443c-89c2-577e277215a1-kubelet-config\") pod \"global-pull-secret-syncer-qtlms\" (UID: \"1ee7a30a-d0ec-443c-89c2-577e277215a1\") " pod="kube-system/global-pull-secret-syncer-qtlms" Apr 23 08:53:22.009685 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:22.009666 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1ee7a30a-d0ec-443c-89c2-577e277215a1-original-pull-secret\") pod \"global-pull-secret-syncer-qtlms\" (UID: \"1ee7a30a-d0ec-443c-89c2-577e277215a1\") " pod="kube-system/global-pull-secret-syncer-qtlms" Apr 23 08:53:22.177594 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:22.177511 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qtlms" Apr 23 08:53:22.296011 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:22.295938 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qtlms"] Apr 23 08:53:22.298846 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:53:22.298814 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ee7a30a_d0ec_443c_89c2_577e277215a1.slice/crio-dae1d44ef1de19675c13a645d56f33956aa18fb85d9d03b4b216f95dd0a1e06a WatchSource:0}: Error finding container dae1d44ef1de19675c13a645d56f33956aa18fb85d9d03b4b216f95dd0a1e06a: Status 404 returned error can't find the container with id dae1d44ef1de19675c13a645d56f33956aa18fb85d9d03b4b216f95dd0a1e06a Apr 23 08:53:22.300279 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:22.300263 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 08:53:23.130028 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:23.129988 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qtlms" event={"ID":"1ee7a30a-d0ec-443c-89c2-577e277215a1","Type":"ContainerStarted","Data":"dae1d44ef1de19675c13a645d56f33956aa18fb85d9d03b4b216f95dd0a1e06a"} Apr 23 08:53:53.218791 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:53.218757 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qtlms" 
event={"ID":"1ee7a30a-d0ec-443c-89c2-577e277215a1","Type":"ContainerStarted","Data":"6f48cc26215c57f281324dc3869ce5be53924dd6f77a7f6aa3ddfd2134ab0133"} Apr 23 08:53:53.235512 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:53:53.235463 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-qtlms" podStartSLOduration=1.955602574 podStartE2EDuration="32.235448865s" podCreationTimestamp="2026-04-23 08:53:21 +0000 UTC" firstStartedPulling="2026-04-23 08:53:22.300414257 +0000 UTC m=+313.817273570" lastFinishedPulling="2026-04-23 08:53:52.580260531 +0000 UTC m=+344.097119861" observedRunningTime="2026-04-23 08:53:53.234352808 +0000 UTC m=+344.751212143" watchObservedRunningTime="2026-04-23 08:53:53.235448865 +0000 UTC m=+344.752308200" Apr 23 08:54:01.239507 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:01.239422 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n"] Apr 23 08:54:01.243025 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:01.243005 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n" Apr 23 08:54:01.245886 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:01.245866 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 08:54:01.245886 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:01.245879 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 08:54:01.247124 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:01.247104 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-tzgsl\"" Apr 23 08:54:01.254098 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:01.254077 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n"] Apr 23 08:54:01.319358 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:01.319317 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38fc2d0b-8440-48bc-ae64-83868fb08707-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n\" (UID: \"38fc2d0b-8440-48bc-ae64-83868fb08707\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n" Apr 23 08:54:01.319497 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:01.319438 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pfd2\" (UniqueName: \"kubernetes.io/projected/38fc2d0b-8440-48bc-ae64-83868fb08707-kube-api-access-7pfd2\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n\" (UID: \"38fc2d0b-8440-48bc-ae64-83868fb08707\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n" Apr 23 08:54:01.319497 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:01.319472 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38fc2d0b-8440-48bc-ae64-83868fb08707-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n\" (UID: \"38fc2d0b-8440-48bc-ae64-83868fb08707\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n" Apr 23 08:54:01.420171 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:01.420129 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pfd2\" (UniqueName: \"kubernetes.io/projected/38fc2d0b-8440-48bc-ae64-83868fb08707-kube-api-access-7pfd2\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n\" (UID: \"38fc2d0b-8440-48bc-ae64-83868fb08707\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n" Apr 23 08:54:01.420334 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:01.420198 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38fc2d0b-8440-48bc-ae64-83868fb08707-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n\" (UID: \"38fc2d0b-8440-48bc-ae64-83868fb08707\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n" Apr 23 08:54:01.420334 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:01.420264 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38fc2d0b-8440-48bc-ae64-83868fb08707-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n\" (UID: \"38fc2d0b-8440-48bc-ae64-83868fb08707\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n" Apr 23 08:54:01.420638 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:01.420622 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38fc2d0b-8440-48bc-ae64-83868fb08707-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n\" (UID: \"38fc2d0b-8440-48bc-ae64-83868fb08707\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n" Apr 23 08:54:01.420677 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:01.420651 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38fc2d0b-8440-48bc-ae64-83868fb08707-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n\" (UID: \"38fc2d0b-8440-48bc-ae64-83868fb08707\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n" Apr 23 08:54:01.430121 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:01.430089 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pfd2\" (UniqueName: \"kubernetes.io/projected/38fc2d0b-8440-48bc-ae64-83868fb08707-kube-api-access-7pfd2\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n\" (UID: \"38fc2d0b-8440-48bc-ae64-83868fb08707\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n" Apr 23 08:54:01.551935 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:01.551851 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n" Apr 23 08:54:01.677594 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:01.677560 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n"] Apr 23 08:54:01.680468 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:54:01.680438 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38fc2d0b_8440_48bc_ae64_83868fb08707.slice/crio-abc4d8a1f999b85e3f41fb85fe42572b59ec2bf493bd924ef89462b44eb4da96 WatchSource:0}: Error finding container abc4d8a1f999b85e3f41fb85fe42572b59ec2bf493bd924ef89462b44eb4da96: Status 404 returned error can't find the container with id abc4d8a1f999b85e3f41fb85fe42572b59ec2bf493bd924ef89462b44eb4da96 Apr 23 08:54:02.245264 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:02.245231 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n" event={"ID":"38fc2d0b-8440-48bc-ae64-83868fb08707","Type":"ContainerStarted","Data":"abc4d8a1f999b85e3f41fb85fe42572b59ec2bf493bd924ef89462b44eb4da96"} Apr 23 08:54:09.268853 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:09.268821 2574 generic.go:358] "Generic (PLEG): container finished" podID="38fc2d0b-8440-48bc-ae64-83868fb08707" containerID="62ab4b750df2e56eb1deb95bbe52c836028f472839532a84ed21cb93f459d7e9" exitCode=0 Apr 23 08:54:09.269312 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:09.268924 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n" event={"ID":"38fc2d0b-8440-48bc-ae64-83868fb08707","Type":"ContainerDied","Data":"62ab4b750df2e56eb1deb95bbe52c836028f472839532a84ed21cb93f459d7e9"} Apr 23 08:54:11.277481 ip-10-0-139-48 kubenswrapper[2574]: I0423 
08:54:11.277437 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n" event={"ID":"38fc2d0b-8440-48bc-ae64-83868fb08707","Type":"ContainerStarted","Data":"aad15953b503c0fff874fffbda17394130fda94cb25b22a1d0454c5bb7661089"} Apr 23 08:54:12.282316 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:12.282279 2574 generic.go:358] "Generic (PLEG): container finished" podID="38fc2d0b-8440-48bc-ae64-83868fb08707" containerID="aad15953b503c0fff874fffbda17394130fda94cb25b22a1d0454c5bb7661089" exitCode=0 Apr 23 08:54:12.282705 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:12.282360 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n" event={"ID":"38fc2d0b-8440-48bc-ae64-83868fb08707","Type":"ContainerDied","Data":"aad15953b503c0fff874fffbda17394130fda94cb25b22a1d0454c5bb7661089"} Apr 23 08:54:19.305967 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:19.305929 2574 generic.go:358] "Generic (PLEG): container finished" podID="38fc2d0b-8440-48bc-ae64-83868fb08707" containerID="195ae957c77fcd2f71a2e1b01c73d57a375c1706e3f2a2ff6b5bf978eebd71a6" exitCode=0 Apr 23 08:54:19.306320 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:19.306021 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n" event={"ID":"38fc2d0b-8440-48bc-ae64-83868fb08707","Type":"ContainerDied","Data":"195ae957c77fcd2f71a2e1b01c73d57a375c1706e3f2a2ff6b5bf978eebd71a6"} Apr 23 08:54:20.429901 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:20.429879 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n" Apr 23 08:54:20.484568 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:20.484537 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38fc2d0b-8440-48bc-ae64-83868fb08707-bundle\") pod \"38fc2d0b-8440-48bc-ae64-83868fb08707\" (UID: \"38fc2d0b-8440-48bc-ae64-83868fb08707\") " Apr 23 08:54:20.484568 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:20.484569 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38fc2d0b-8440-48bc-ae64-83868fb08707-util\") pod \"38fc2d0b-8440-48bc-ae64-83868fb08707\" (UID: \"38fc2d0b-8440-48bc-ae64-83868fb08707\") " Apr 23 08:54:20.484729 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:20.484624 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pfd2\" (UniqueName: \"kubernetes.io/projected/38fc2d0b-8440-48bc-ae64-83868fb08707-kube-api-access-7pfd2\") pod \"38fc2d0b-8440-48bc-ae64-83868fb08707\" (UID: \"38fc2d0b-8440-48bc-ae64-83868fb08707\") " Apr 23 08:54:20.485191 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:20.485167 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38fc2d0b-8440-48bc-ae64-83868fb08707-bundle" (OuterVolumeSpecName: "bundle") pod "38fc2d0b-8440-48bc-ae64-83868fb08707" (UID: "38fc2d0b-8440-48bc-ae64-83868fb08707"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 08:54:20.486919 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:20.486897 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38fc2d0b-8440-48bc-ae64-83868fb08707-kube-api-access-7pfd2" (OuterVolumeSpecName: "kube-api-access-7pfd2") pod "38fc2d0b-8440-48bc-ae64-83868fb08707" (UID: "38fc2d0b-8440-48bc-ae64-83868fb08707"). InnerVolumeSpecName "kube-api-access-7pfd2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:54:20.489840 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:20.489816 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38fc2d0b-8440-48bc-ae64-83868fb08707-util" (OuterVolumeSpecName: "util") pod "38fc2d0b-8440-48bc-ae64-83868fb08707" (UID: "38fc2d0b-8440-48bc-ae64-83868fb08707"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 08:54:20.585144 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:20.585121 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38fc2d0b-8440-48bc-ae64-83868fb08707-bundle\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:54:20.585144 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:20.585144 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38fc2d0b-8440-48bc-ae64-83868fb08707-util\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:54:20.585271 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:20.585155 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7pfd2\" (UniqueName: \"kubernetes.io/projected/38fc2d0b-8440-48bc-ae64-83868fb08707-kube-api-access-7pfd2\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:54:21.314149 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:21.314107 2574 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n" event={"ID":"38fc2d0b-8440-48bc-ae64-83868fb08707","Type":"ContainerDied","Data":"abc4d8a1f999b85e3f41fb85fe42572b59ec2bf493bd924ef89462b44eb4da96"} Apr 23 08:54:21.314149 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:21.314144 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abc4d8a1f999b85e3f41fb85fe42572b59ec2bf493bd924ef89462b44eb4da96" Apr 23 08:54:21.314149 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:21.314146 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgnh9n" Apr 23 08:54:29.306434 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.306397 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-48bng"] Apr 23 08:54:29.306868 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.306727 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38fc2d0b-8440-48bc-ae64-83868fb08707" containerName="util" Apr 23 08:54:29.306868 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.306738 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="38fc2d0b-8440-48bc-ae64-83868fb08707" containerName="util" Apr 23 08:54:29.306868 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.306749 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38fc2d0b-8440-48bc-ae64-83868fb08707" containerName="extract" Apr 23 08:54:29.306868 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.306754 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="38fc2d0b-8440-48bc-ae64-83868fb08707" containerName="extract" Apr 23 08:54:29.306868 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.306767 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="38fc2d0b-8440-48bc-ae64-83868fb08707" containerName="pull" Apr 23 08:54:29.306868 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.306772 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="38fc2d0b-8440-48bc-ae64-83868fb08707" containerName="pull" Apr 23 08:54:29.306868 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.306824 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="38fc2d0b-8440-48bc-ae64-83868fb08707" containerName="extract" Apr 23 08:54:29.309718 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.309693 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-48bng" Apr 23 08:54:29.312794 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.312757 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:54:29.312906 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.312795 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 23 08:54:29.312906 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.312874 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-l2fmn\"" Apr 23 08:54:29.322173 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.322152 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-48bng"] Apr 23 08:54:29.358477 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.358449 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/27dd78e9-47e6-46ff-aacb-e9adfa384fc6-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-48bng\" (UID: 
\"27dd78e9-47e6-46ff-aacb-e9adfa384fc6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-48bng" Apr 23 08:54:29.358586 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.358480 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blgdk\" (UniqueName: \"kubernetes.io/projected/27dd78e9-47e6-46ff-aacb-e9adfa384fc6-kube-api-access-blgdk\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-48bng\" (UID: \"27dd78e9-47e6-46ff-aacb-e9adfa384fc6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-48bng" Apr 23 08:54:29.459214 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.459185 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/27dd78e9-47e6-46ff-aacb-e9adfa384fc6-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-48bng\" (UID: \"27dd78e9-47e6-46ff-aacb-e9adfa384fc6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-48bng" Apr 23 08:54:29.459354 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.459216 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blgdk\" (UniqueName: \"kubernetes.io/projected/27dd78e9-47e6-46ff-aacb-e9adfa384fc6-kube-api-access-blgdk\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-48bng\" (UID: \"27dd78e9-47e6-46ff-aacb-e9adfa384fc6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-48bng" Apr 23 08:54:29.459611 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.459591 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/27dd78e9-47e6-46ff-aacb-e9adfa384fc6-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-48bng\" (UID: \"27dd78e9-47e6-46ff-aacb-e9adfa384fc6\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-48bng" Apr 23 08:54:29.468735 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.468712 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blgdk\" (UniqueName: \"kubernetes.io/projected/27dd78e9-47e6-46ff-aacb-e9adfa384fc6-kube-api-access-blgdk\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-48bng\" (UID: \"27dd78e9-47e6-46ff-aacb-e9adfa384fc6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-48bng" Apr 23 08:54:29.619145 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.619120 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-48bng" Apr 23 08:54:29.747868 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:29.747843 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-48bng"] Apr 23 08:54:29.749322 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:54:29.749298 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27dd78e9_47e6_46ff_aacb_e9adfa384fc6.slice/crio-d7f231a1787b98d54f7c13ab0f0fc352c1522379bbc802d69dffc240a3acafee WatchSource:0}: Error finding container d7f231a1787b98d54f7c13ab0f0fc352c1522379bbc802d69dffc240a3acafee: Status 404 returned error can't find the container with id d7f231a1787b98d54f7c13ab0f0fc352c1522379bbc802d69dffc240a3acafee Apr 23 08:54:30.343960 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:30.343928 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-48bng" event={"ID":"27dd78e9-47e6-46ff-aacb-e9adfa384fc6","Type":"ContainerStarted","Data":"d7f231a1787b98d54f7c13ab0f0fc352c1522379bbc802d69dffc240a3acafee"} Apr 23 08:54:32.353193 
ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:32.353155 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-48bng" event={"ID":"27dd78e9-47e6-46ff-aacb-e9adfa384fc6","Type":"ContainerStarted","Data":"76cd2bda76941bc3e8b3979dceb556c2e4bc0ad4cdda02cbaa8bf9fcb81dd9b9"} Apr 23 08:54:32.380125 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:32.380085 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-48bng" podStartSLOduration=1.660988455 podStartE2EDuration="3.380071266s" podCreationTimestamp="2026-04-23 08:54:29 +0000 UTC" firstStartedPulling="2026-04-23 08:54:29.751704217 +0000 UTC m=+381.268563531" lastFinishedPulling="2026-04-23 08:54:31.470787025 +0000 UTC m=+382.987646342" observedRunningTime="2026-04-23 08:54:32.378311032 +0000 UTC m=+383.895170367" watchObservedRunningTime="2026-04-23 08:54:32.380071266 +0000 UTC m=+383.896930601" Apr 23 08:54:36.066348 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:36.066314 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-8c7sk"] Apr 23 08:54:36.069719 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:36.069701 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-8c7sk" Apr 23 08:54:36.072790 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:36.072768 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 23 08:54:36.073945 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:36.073925 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-fbv64\"" Apr 23 08:54:36.074029 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:36.073942 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 23 08:54:36.080693 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:36.080676 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-8c7sk"] Apr 23 08:54:36.116571 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:36.116545 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8dlg\" (UniqueName: \"kubernetes.io/projected/82c9d19a-0ad0-402b-bca8-df4e5e8fc633-kube-api-access-j8dlg\") pod \"cert-manager-cainjector-68b757865b-8c7sk\" (UID: \"82c9d19a-0ad0-402b-bca8-df4e5e8fc633\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8c7sk" Apr 23 08:54:36.116693 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:36.116599 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82c9d19a-0ad0-402b-bca8-df4e5e8fc633-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-8c7sk\" (UID: \"82c9d19a-0ad0-402b-bca8-df4e5e8fc633\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8c7sk" Apr 23 08:54:36.217237 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:36.217198 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-j8dlg\" (UniqueName: \"kubernetes.io/projected/82c9d19a-0ad0-402b-bca8-df4e5e8fc633-kube-api-access-j8dlg\") pod \"cert-manager-cainjector-68b757865b-8c7sk\" (UID: \"82c9d19a-0ad0-402b-bca8-df4e5e8fc633\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8c7sk" Apr 23 08:54:36.217423 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:36.217351 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82c9d19a-0ad0-402b-bca8-df4e5e8fc633-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-8c7sk\" (UID: \"82c9d19a-0ad0-402b-bca8-df4e5e8fc633\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8c7sk" Apr 23 08:54:36.230037 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:36.229997 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8dlg\" (UniqueName: \"kubernetes.io/projected/82c9d19a-0ad0-402b-bca8-df4e5e8fc633-kube-api-access-j8dlg\") pod \"cert-manager-cainjector-68b757865b-8c7sk\" (UID: \"82c9d19a-0ad0-402b-bca8-df4e5e8fc633\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8c7sk" Apr 23 08:54:36.230453 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:36.230418 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82c9d19a-0ad0-402b-bca8-df4e5e8fc633-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-8c7sk\" (UID: \"82c9d19a-0ad0-402b-bca8-df4e5e8fc633\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8c7sk" Apr 23 08:54:36.391203 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:36.391171 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-8c7sk" Apr 23 08:54:36.512780 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:36.512746 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-8c7sk"] Apr 23 08:54:36.516483 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:54:36.516452 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82c9d19a_0ad0_402b_bca8_df4e5e8fc633.slice/crio-1eb88fe8df664ad8b4a46c75fcd3157dded78939ac561c13533316218104092f WatchSource:0}: Error finding container 1eb88fe8df664ad8b4a46c75fcd3157dded78939ac561c13533316218104092f: Status 404 returned error can't find the container with id 1eb88fe8df664ad8b4a46c75fcd3157dded78939ac561c13533316218104092f Apr 23 08:54:37.375963 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:37.375916 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-8c7sk" event={"ID":"82c9d19a-0ad0-402b-bca8-df4e5e8fc633","Type":"ContainerStarted","Data":"1eb88fe8df664ad8b4a46c75fcd3157dded78939ac561c13533316218104092f"} Apr 23 08:54:40.389467 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:40.389425 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-8c7sk" event={"ID":"82c9d19a-0ad0-402b-bca8-df4e5e8fc633","Type":"ContainerStarted","Data":"b6625ef432aedee278bfc6fc8b07507cd01d39eb09b536223fd9250cb2680818"} Apr 23 08:54:40.408065 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:40.408014 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-8c7sk" podStartSLOduration=1.4235590550000001 podStartE2EDuration="4.408002063s" podCreationTimestamp="2026-04-23 08:54:36 +0000 UTC" firstStartedPulling="2026-04-23 08:54:36.518436611 +0000 UTC m=+388.035295925" lastFinishedPulling="2026-04-23 
08:54:39.502879616 +0000 UTC m=+391.019738933" observedRunningTime="2026-04-23 08:54:40.406528278 +0000 UTC m=+391.923387614" watchObservedRunningTime="2026-04-23 08:54:40.408002063 +0000 UTC m=+391.924861398" Apr 23 08:54:40.440215 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:40.440187 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-mszpm"] Apr 23 08:54:40.469712 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:40.469684 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-mszpm" Apr 23 08:54:40.473170 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:40.473142 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-sc8wl\"" Apr 23 08:54:40.480753 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:40.480732 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-mszpm"] Apr 23 08:54:40.558562 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:40.558539 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvpsl\" (UniqueName: \"kubernetes.io/projected/1b58b4d4-437b-4c71-95ac-ef9a9bb72fb6-kube-api-access-nvpsl\") pod \"cert-manager-webhook-587ccfb98-mszpm\" (UID: \"1b58b4d4-437b-4c71-95ac-ef9a9bb72fb6\") " pod="cert-manager/cert-manager-webhook-587ccfb98-mszpm" Apr 23 08:54:40.558664 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:40.558572 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b58b4d4-437b-4c71-95ac-ef9a9bb72fb6-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-mszpm\" (UID: \"1b58b4d4-437b-4c71-95ac-ef9a9bb72fb6\") " pod="cert-manager/cert-manager-webhook-587ccfb98-mszpm" Apr 23 08:54:40.659863 ip-10-0-139-48 kubenswrapper[2574]: I0423 
08:54:40.659805 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvpsl\" (UniqueName: \"kubernetes.io/projected/1b58b4d4-437b-4c71-95ac-ef9a9bb72fb6-kube-api-access-nvpsl\") pod \"cert-manager-webhook-587ccfb98-mszpm\" (UID: \"1b58b4d4-437b-4c71-95ac-ef9a9bb72fb6\") " pod="cert-manager/cert-manager-webhook-587ccfb98-mszpm" Apr 23 08:54:40.659863 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:40.659838 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b58b4d4-437b-4c71-95ac-ef9a9bb72fb6-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-mszpm\" (UID: \"1b58b4d4-437b-4c71-95ac-ef9a9bb72fb6\") " pod="cert-manager/cert-manager-webhook-587ccfb98-mszpm" Apr 23 08:54:40.674718 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:40.674695 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvpsl\" (UniqueName: \"kubernetes.io/projected/1b58b4d4-437b-4c71-95ac-ef9a9bb72fb6-kube-api-access-nvpsl\") pod \"cert-manager-webhook-587ccfb98-mszpm\" (UID: \"1b58b4d4-437b-4c71-95ac-ef9a9bb72fb6\") " pod="cert-manager/cert-manager-webhook-587ccfb98-mszpm" Apr 23 08:54:40.675044 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:40.675027 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b58b4d4-437b-4c71-95ac-ef9a9bb72fb6-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-mszpm\" (UID: \"1b58b4d4-437b-4c71-95ac-ef9a9bb72fb6\") " pod="cert-manager/cert-manager-webhook-587ccfb98-mszpm" Apr 23 08:54:40.779379 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:40.779342 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-mszpm" Apr 23 08:54:40.901860 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:40.901759 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-mszpm"] Apr 23 08:54:40.903892 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:54:40.903861 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b58b4d4_437b_4c71_95ac_ef9a9bb72fb6.slice/crio-44ec28c3c0223ac264c1e8215b1242f6ff47e7dc8fc39b6e4b7dd28c04379bb8 WatchSource:0}: Error finding container 44ec28c3c0223ac264c1e8215b1242f6ff47e7dc8fc39b6e4b7dd28c04379bb8: Status 404 returned error can't find the container with id 44ec28c3c0223ac264c1e8215b1242f6ff47e7dc8fc39b6e4b7dd28c04379bb8 Apr 23 08:54:41.394549 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:41.394510 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-mszpm" event={"ID":"1b58b4d4-437b-4c71-95ac-ef9a9bb72fb6","Type":"ContainerStarted","Data":"1208add30636e48d01185eaa4b90f12971643958502f5aa1a45ee5038f2dd29c"} Apr 23 08:54:41.395016 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:41.394554 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-mszpm" event={"ID":"1b58b4d4-437b-4c71-95ac-ef9a9bb72fb6","Type":"ContainerStarted","Data":"44ec28c3c0223ac264c1e8215b1242f6ff47e7dc8fc39b6e4b7dd28c04379bb8"} Apr 23 08:54:41.413978 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:41.413934 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-mszpm" podStartSLOduration=1.413920396 podStartE2EDuration="1.413920396s" podCreationTimestamp="2026-04-23 08:54:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 
08:54:41.412078688 +0000 UTC m=+392.928938035" watchObservedRunningTime="2026-04-23 08:54:41.413920396 +0000 UTC m=+392.930779731" Apr 23 08:54:42.398274 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:42.398245 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-mszpm" Apr 23 08:54:48.404723 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:48.404696 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-mszpm" Apr 23 08:54:57.150300 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:57.150261 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t"] Apr 23 08:54:57.153705 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:57.153687 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t" Apr 23 08:54:57.156703 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:57.156669 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 08:54:57.156703 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:57.156669 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-tzgsl\"" Apr 23 08:54:57.157837 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:57.157801 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 08:54:57.162112 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:57.162091 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t"] Apr 23 08:54:57.305642 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:57.305607 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b519ad8a-6e9c-4c15-a477-70837b26948a-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t\" (UID: \"b519ad8a-6e9c-4c15-a477-70837b26948a\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t" Apr 23 08:54:57.305827 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:57.305657 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnrjp\" (UniqueName: \"kubernetes.io/projected/b519ad8a-6e9c-4c15-a477-70837b26948a-kube-api-access-hnrjp\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t\" (UID: \"b519ad8a-6e9c-4c15-a477-70837b26948a\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t" Apr 23 08:54:57.305827 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:57.305680 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b519ad8a-6e9c-4c15-a477-70837b26948a-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t\" (UID: \"b519ad8a-6e9c-4c15-a477-70837b26948a\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t" Apr 23 08:54:57.406551 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:57.406476 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnrjp\" (UniqueName: \"kubernetes.io/projected/b519ad8a-6e9c-4c15-a477-70837b26948a-kube-api-access-hnrjp\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t\" (UID: \"b519ad8a-6e9c-4c15-a477-70837b26948a\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t" Apr 23 08:54:57.406551 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:57.406511 
2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b519ad8a-6e9c-4c15-a477-70837b26948a-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t\" (UID: \"b519ad8a-6e9c-4c15-a477-70837b26948a\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t" Apr 23 08:54:57.406695 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:57.406573 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b519ad8a-6e9c-4c15-a477-70837b26948a-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t\" (UID: \"b519ad8a-6e9c-4c15-a477-70837b26948a\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t" Apr 23 08:54:57.406919 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:57.406904 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b519ad8a-6e9c-4c15-a477-70837b26948a-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t\" (UID: \"b519ad8a-6e9c-4c15-a477-70837b26948a\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t" Apr 23 08:54:57.406959 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:57.406932 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b519ad8a-6e9c-4c15-a477-70837b26948a-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t\" (UID: \"b519ad8a-6e9c-4c15-a477-70837b26948a\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t" Apr 23 08:54:57.415548 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:57.415525 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnrjp\" (UniqueName: 
\"kubernetes.io/projected/b519ad8a-6e9c-4c15-a477-70837b26948a-kube-api-access-hnrjp\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t\" (UID: \"b519ad8a-6e9c-4c15-a477-70837b26948a\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t" Apr 23 08:54:57.464851 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:57.464820 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t" Apr 23 08:54:57.588748 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:57.588725 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t"] Apr 23 08:54:57.590702 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:54:57.590673 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb519ad8a_6e9c_4c15_a477_70837b26948a.slice/crio-372540a6cc23ee6bc1c72efe5f59cdcbbd930c7ceb49c49f2bc7c8a69967ca7b WatchSource:0}: Error finding container 372540a6cc23ee6bc1c72efe5f59cdcbbd930c7ceb49c49f2bc7c8a69967ca7b: Status 404 returned error can't find the container with id 372540a6cc23ee6bc1c72efe5f59cdcbbd930c7ceb49c49f2bc7c8a69967ca7b Apr 23 08:54:58.454824 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:58.454788 2574 generic.go:358] "Generic (PLEG): container finished" podID="b519ad8a-6e9c-4c15-a477-70837b26948a" containerID="f892a5525c5d0d4b49f5b2c13ac9866990c73ae2e45bb498b87df2b11793289e" exitCode=0 Apr 23 08:54:58.455177 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:58.454829 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t" event={"ID":"b519ad8a-6e9c-4c15-a477-70837b26948a","Type":"ContainerDied","Data":"f892a5525c5d0d4b49f5b2c13ac9866990c73ae2e45bb498b87df2b11793289e"} Apr 23 
08:54:58.455177 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:54:58.454855 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t" event={"ID":"b519ad8a-6e9c-4c15-a477-70837b26948a","Type":"ContainerStarted","Data":"372540a6cc23ee6bc1c72efe5f59cdcbbd930c7ceb49c49f2bc7c8a69967ca7b"} Apr 23 08:55:01.469675 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:01.469638 2574 generic.go:358] "Generic (PLEG): container finished" podID="b519ad8a-6e9c-4c15-a477-70837b26948a" containerID="ea5912b3b1c22371021f15e22da95781291328654026700151c834a7ad6a9a32" exitCode=0 Apr 23 08:55:01.470148 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:01.469687 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t" event={"ID":"b519ad8a-6e9c-4c15-a477-70837b26948a","Type":"ContainerDied","Data":"ea5912b3b1c22371021f15e22da95781291328654026700151c834a7ad6a9a32"} Apr 23 08:55:02.475117 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:02.475079 2574 generic.go:358] "Generic (PLEG): container finished" podID="b519ad8a-6e9c-4c15-a477-70837b26948a" containerID="e55005d1c04eb4b14949847fea43737935c0bd53fe9434e5b18eb59952e113df" exitCode=0 Apr 23 08:55:02.475526 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:02.475163 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t" event={"ID":"b519ad8a-6e9c-4c15-a477-70837b26948a","Type":"ContainerDied","Data":"e55005d1c04eb4b14949847fea43737935c0bd53fe9434e5b18eb59952e113df"} Apr 23 08:55:03.601928 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:03.601906 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t" Apr 23 08:55:03.763694 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:03.763617 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b519ad8a-6e9c-4c15-a477-70837b26948a-util\") pod \"b519ad8a-6e9c-4c15-a477-70837b26948a\" (UID: \"b519ad8a-6e9c-4c15-a477-70837b26948a\") " Apr 23 08:55:03.763837 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:03.763699 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnrjp\" (UniqueName: \"kubernetes.io/projected/b519ad8a-6e9c-4c15-a477-70837b26948a-kube-api-access-hnrjp\") pod \"b519ad8a-6e9c-4c15-a477-70837b26948a\" (UID: \"b519ad8a-6e9c-4c15-a477-70837b26948a\") " Apr 23 08:55:03.763837 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:03.763717 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b519ad8a-6e9c-4c15-a477-70837b26948a-bundle\") pod \"b519ad8a-6e9c-4c15-a477-70837b26948a\" (UID: \"b519ad8a-6e9c-4c15-a477-70837b26948a\") " Apr 23 08:55:03.764175 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:03.764138 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b519ad8a-6e9c-4c15-a477-70837b26948a-bundle" (OuterVolumeSpecName: "bundle") pod "b519ad8a-6e9c-4c15-a477-70837b26948a" (UID: "b519ad8a-6e9c-4c15-a477-70837b26948a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 08:55:03.765923 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:03.765897 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b519ad8a-6e9c-4c15-a477-70837b26948a-kube-api-access-hnrjp" (OuterVolumeSpecName: "kube-api-access-hnrjp") pod "b519ad8a-6e9c-4c15-a477-70837b26948a" (UID: "b519ad8a-6e9c-4c15-a477-70837b26948a"). InnerVolumeSpecName "kube-api-access-hnrjp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:55:03.768328 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:03.768308 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b519ad8a-6e9c-4c15-a477-70837b26948a-util" (OuterVolumeSpecName: "util") pod "b519ad8a-6e9c-4c15-a477-70837b26948a" (UID: "b519ad8a-6e9c-4c15-a477-70837b26948a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 08:55:03.864653 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:03.864626 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hnrjp\" (UniqueName: \"kubernetes.io/projected/b519ad8a-6e9c-4c15-a477-70837b26948a-kube-api-access-hnrjp\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:55:03.864653 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:03.864648 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b519ad8a-6e9c-4c15-a477-70837b26948a-bundle\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:55:03.864801 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:03.864659 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b519ad8a-6e9c-4c15-a477-70837b26948a-util\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 08:55:04.483498 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:04.483464 2574 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t" event={"ID":"b519ad8a-6e9c-4c15-a477-70837b26948a","Type":"ContainerDied","Data":"372540a6cc23ee6bc1c72efe5f59cdcbbd930c7ceb49c49f2bc7c8a69967ca7b"} Apr 23 08:55:04.483663 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:04.483508 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="372540a6cc23ee6bc1c72efe5f59cdcbbd930c7ceb49c49f2bc7c8a69967ca7b" Apr 23 08:55:04.483663 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:04.483482 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eqhl7t" Apr 23 08:55:09.125696 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.125666 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-t8l5w"] Apr 23 08:55:09.126150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.125981 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b519ad8a-6e9c-4c15-a477-70837b26948a" containerName="pull" Apr 23 08:55:09.126150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.125991 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b519ad8a-6e9c-4c15-a477-70837b26948a" containerName="pull" Apr 23 08:55:09.126150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.126001 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b519ad8a-6e9c-4c15-a477-70837b26948a" containerName="extract" Apr 23 08:55:09.126150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.126008 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b519ad8a-6e9c-4c15-a477-70837b26948a" containerName="extract" Apr 23 08:55:09.126150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.126025 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="b519ad8a-6e9c-4c15-a477-70837b26948a" containerName="util" Apr 23 08:55:09.126150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.126031 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b519ad8a-6e9c-4c15-a477-70837b26948a" containerName="util" Apr 23 08:55:09.126150 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.126089 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b519ad8a-6e9c-4c15-a477-70837b26948a" containerName="extract" Apr 23 08:55:09.130112 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.130094 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-t8l5w" Apr 23 08:55:09.133775 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.133754 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 23 08:55:09.136609 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.136587 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:55:09.147894 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.147871 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-operator-dockercfg-mpdpj\"" Apr 23 08:55:09.149666 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.149647 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-t8l5w"] Apr 23 08:55:09.208870 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.208844 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/98dd6dad-70fe-4607-ac31-47a0875fb851-tmp\") pod \"jobset-operator-747c5859c7-t8l5w\" (UID: \"98dd6dad-70fe-4607-ac31-47a0875fb851\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-t8l5w" Apr 23 
08:55:09.208998 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.208894 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh22z\" (UniqueName: \"kubernetes.io/projected/98dd6dad-70fe-4607-ac31-47a0875fb851-kube-api-access-sh22z\") pod \"jobset-operator-747c5859c7-t8l5w\" (UID: \"98dd6dad-70fe-4607-ac31-47a0875fb851\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-t8l5w" Apr 23 08:55:09.310244 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.310216 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/98dd6dad-70fe-4607-ac31-47a0875fb851-tmp\") pod \"jobset-operator-747c5859c7-t8l5w\" (UID: \"98dd6dad-70fe-4607-ac31-47a0875fb851\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-t8l5w" Apr 23 08:55:09.310362 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.310263 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sh22z\" (UniqueName: \"kubernetes.io/projected/98dd6dad-70fe-4607-ac31-47a0875fb851-kube-api-access-sh22z\") pod \"jobset-operator-747c5859c7-t8l5w\" (UID: \"98dd6dad-70fe-4607-ac31-47a0875fb851\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-t8l5w" Apr 23 08:55:09.310601 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.310583 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/98dd6dad-70fe-4607-ac31-47a0875fb851-tmp\") pod \"jobset-operator-747c5859c7-t8l5w\" (UID: \"98dd6dad-70fe-4607-ac31-47a0875fb851\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-t8l5w" Apr 23 08:55:09.318991 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.318974 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh22z\" (UniqueName: \"kubernetes.io/projected/98dd6dad-70fe-4607-ac31-47a0875fb851-kube-api-access-sh22z\") pod 
\"jobset-operator-747c5859c7-t8l5w\" (UID: \"98dd6dad-70fe-4607-ac31-47a0875fb851\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-t8l5w" Apr 23 08:55:09.438967 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.438900 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-t8l5w" Apr 23 08:55:09.562286 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:09.562255 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-t8l5w"] Apr 23 08:55:09.564733 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:55:09.564704 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98dd6dad_70fe_4607_ac31_47a0875fb851.slice/crio-d79c5e2f9667d8c01ea67aa9ed5b6499b8de5edcfdb4ded7a37179a12347b5be WatchSource:0}: Error finding container d79c5e2f9667d8c01ea67aa9ed5b6499b8de5edcfdb4ded7a37179a12347b5be: Status 404 returned error can't find the container with id d79c5e2f9667d8c01ea67aa9ed5b6499b8de5edcfdb4ded7a37179a12347b5be Apr 23 08:55:10.506141 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:10.506101 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-t8l5w" event={"ID":"98dd6dad-70fe-4607-ac31-47a0875fb851","Type":"ContainerStarted","Data":"d79c5e2f9667d8c01ea67aa9ed5b6499b8de5edcfdb4ded7a37179a12347b5be"} Apr 23 08:55:12.514407 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:12.514350 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-t8l5w" event={"ID":"98dd6dad-70fe-4607-ac31-47a0875fb851","Type":"ContainerStarted","Data":"832c678a0fdd4c30ee2b912d6b7f28287c8bfc68e543c8c1bbca0ecdabb25cd0"} Apr 23 08:55:12.532541 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:12.532492 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-jobset-operator/jobset-operator-747c5859c7-t8l5w" podStartSLOduration=1.586196423 podStartE2EDuration="3.532477213s" podCreationTimestamp="2026-04-23 08:55:09 +0000 UTC" firstStartedPulling="2026-04-23 08:55:09.566160362 +0000 UTC m=+421.083019678" lastFinishedPulling="2026-04-23 08:55:11.512441139 +0000 UTC m=+423.029300468" observedRunningTime="2026-04-23 08:55:12.53055279 +0000 UTC m=+424.047412127" watchObservedRunningTime="2026-04-23 08:55:12.532477213 +0000 UTC m=+424.049336548" Apr 23 08:55:37.135110 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:37.135032 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-gg5zb"] Apr 23 08:55:37.139401 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:37.139365 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-gg5zb" Apr 23 08:55:37.142345 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:37.142318 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\"" Apr 23 08:55:37.143604 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:37.143584 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\"" Apr 23 08:55:37.143700 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:37.143584 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-9rrlc\"" Apr 23 08:55:37.143700 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:37.143588 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 23 08:55:37.143779 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:37.143592 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 
23 08:55:37.148056 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:37.148036 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-gg5zb"] Apr 23 08:55:37.246056 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:37.246022 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzw6n\" (UniqueName: \"kubernetes.io/projected/71e58cf0-1b23-40b5-8471-cc7b8bba74ff-kube-api-access-zzw6n\") pod \"kubeflow-trainer-controller-manager-55f5694779-gg5zb\" (UID: \"71e58cf0-1b23-40b5-8471-cc7b8bba74ff\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-gg5zb" Apr 23 08:55:37.246229 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:37.246089 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71e58cf0-1b23-40b5-8471-cc7b8bba74ff-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-gg5zb\" (UID: \"71e58cf0-1b23-40b5-8471-cc7b8bba74ff\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-gg5zb" Apr 23 08:55:37.246229 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:37.246119 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/71e58cf0-1b23-40b5-8471-cc7b8bba74ff-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-gg5zb\" (UID: \"71e58cf0-1b23-40b5-8471-cc7b8bba74ff\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-gg5zb" Apr 23 08:55:37.346787 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:37.346757 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/71e58cf0-1b23-40b5-8471-cc7b8bba74ff-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-gg5zb\" 
(UID: \"71e58cf0-1b23-40b5-8471-cc7b8bba74ff\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-gg5zb"
Apr 23 08:55:37.346930 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:37.346807 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzw6n\" (UniqueName: \"kubernetes.io/projected/71e58cf0-1b23-40b5-8471-cc7b8bba74ff-kube-api-access-zzw6n\") pod \"kubeflow-trainer-controller-manager-55f5694779-gg5zb\" (UID: \"71e58cf0-1b23-40b5-8471-cc7b8bba74ff\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-gg5zb"
Apr 23 08:55:37.346930 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:37.346860 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71e58cf0-1b23-40b5-8471-cc7b8bba74ff-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-gg5zb\" (UID: \"71e58cf0-1b23-40b5-8471-cc7b8bba74ff\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-gg5zb"
Apr 23 08:55:37.347527 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:37.347461 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/71e58cf0-1b23-40b5-8471-cc7b8bba74ff-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-gg5zb\" (UID: \"71e58cf0-1b23-40b5-8471-cc7b8bba74ff\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-gg5zb"
Apr 23 08:55:37.349147 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:37.349128 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71e58cf0-1b23-40b5-8471-cc7b8bba74ff-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-gg5zb\" (UID: \"71e58cf0-1b23-40b5-8471-cc7b8bba74ff\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-gg5zb"
Apr 23 08:55:37.358118 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:37.358094 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzw6n\" (UniqueName: \"kubernetes.io/projected/71e58cf0-1b23-40b5-8471-cc7b8bba74ff-kube-api-access-zzw6n\") pod \"kubeflow-trainer-controller-manager-55f5694779-gg5zb\" (UID: \"71e58cf0-1b23-40b5-8471-cc7b8bba74ff\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-gg5zb"
Apr 23 08:55:37.449524 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:37.449432 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-gg5zb"
Apr 23 08:55:37.584575 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:37.584543 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-gg5zb"]
Apr 23 08:55:37.586892 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:55:37.586865 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71e58cf0_1b23_40b5_8471_cc7b8bba74ff.slice/crio-903af4e657036bfb8cbc2659e0034bbecdf1e7b6347da1e87c40871866368035 WatchSource:0}: Error finding container 903af4e657036bfb8cbc2659e0034bbecdf1e7b6347da1e87c40871866368035: Status 404 returned error can't find the container with id 903af4e657036bfb8cbc2659e0034bbecdf1e7b6347da1e87c40871866368035
Apr 23 08:55:37.601122 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:37.601094 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-gg5zb" event={"ID":"71e58cf0-1b23-40b5-8471-cc7b8bba74ff","Type":"ContainerStarted","Data":"903af4e657036bfb8cbc2659e0034bbecdf1e7b6347da1e87c40871866368035"}
Apr 23 08:55:40.613675 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:40.613640 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-gg5zb" event={"ID":"71e58cf0-1b23-40b5-8471-cc7b8bba74ff","Type":"ContainerStarted","Data":"e3a876b1fde9567fac835f70de5a16e4205b046012b1857405adbda71a324451"}
Apr 23 08:55:40.614101 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:40.613942 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-gg5zb"
Apr 23 08:55:40.633992 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:40.633935 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-gg5zb" podStartSLOduration=1.119019561 podStartE2EDuration="3.633918855s" podCreationTimestamp="2026-04-23 08:55:37 +0000 UTC" firstStartedPulling="2026-04-23 08:55:37.588612134 +0000 UTC m=+449.105471450" lastFinishedPulling="2026-04-23 08:55:40.103511428 +0000 UTC m=+451.620370744" observedRunningTime="2026-04-23 08:55:40.632448926 +0000 UTC m=+452.149308261" watchObservedRunningTime="2026-04-23 08:55:40.633918855 +0000 UTC m=+452.150778191"
Apr 23 08:55:56.622216 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:55:56.622182 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-gg5zb"
Apr 23 08:57:28.678482 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.678403 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-99467696b-gfgtw"]
Apr 23 08:57:28.681734 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.681714 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.690928 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.690906 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-99467696b-gfgtw"]
Apr 23 08:57:28.777559 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.777524 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1a3a79d8-6d77-452b-8aed-c19229cc6530-console-oauth-config\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.777559 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.777561 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1a3a79d8-6d77-452b-8aed-c19229cc6530-oauth-serving-cert\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.777790 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.777592 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1a3a79d8-6d77-452b-8aed-c19229cc6530-service-ca\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.777790 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.777709 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1a3a79d8-6d77-452b-8aed-c19229cc6530-console-config\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.777790 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.777748 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a3a79d8-6d77-452b-8aed-c19229cc6530-trusted-ca-bundle\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.777912 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.777791 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfqmq\" (UniqueName: \"kubernetes.io/projected/1a3a79d8-6d77-452b-8aed-c19229cc6530-kube-api-access-cfqmq\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.777912 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.777829 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a3a79d8-6d77-452b-8aed-c19229cc6530-console-serving-cert\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.878943 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.878916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1a3a79d8-6d77-452b-8aed-c19229cc6530-service-ca\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.879098 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.878966 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1a3a79d8-6d77-452b-8aed-c19229cc6530-console-config\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.879098 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.878985 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a3a79d8-6d77-452b-8aed-c19229cc6530-trusted-ca-bundle\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.879213 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.879106 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfqmq\" (UniqueName: \"kubernetes.io/projected/1a3a79d8-6d77-452b-8aed-c19229cc6530-kube-api-access-cfqmq\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.879213 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.879166 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a3a79d8-6d77-452b-8aed-c19229cc6530-console-serving-cert\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.879213 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.879196 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1a3a79d8-6d77-452b-8aed-c19229cc6530-console-oauth-config\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.879417 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.879215 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1a3a79d8-6d77-452b-8aed-c19229cc6530-oauth-serving-cert\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.879739 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.879717 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1a3a79d8-6d77-452b-8aed-c19229cc6530-service-ca\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.879823 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.879802 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1a3a79d8-6d77-452b-8aed-c19229cc6530-console-config\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.879888 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.879874 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a3a79d8-6d77-452b-8aed-c19229cc6530-trusted-ca-bundle\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.879950 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.879934 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1a3a79d8-6d77-452b-8aed-c19229cc6530-oauth-serving-cert\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.882318 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.882296 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a3a79d8-6d77-452b-8aed-c19229cc6530-console-serving-cert\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.882318 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.882309 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1a3a79d8-6d77-452b-8aed-c19229cc6530-console-oauth-config\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.887982 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.887963 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfqmq\" (UniqueName: \"kubernetes.io/projected/1a3a79d8-6d77-452b-8aed-c19229cc6530-kube-api-access-cfqmq\") pod \"console-99467696b-gfgtw\" (UID: \"1a3a79d8-6d77-452b-8aed-c19229cc6530\") " pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:28.991374 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:28.991297 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:29.112376 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:29.112349 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-99467696b-gfgtw"]
Apr 23 08:57:29.114379 ip-10-0-139-48 kubenswrapper[2574]: W0423 08:57:29.114353 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a3a79d8_6d77_452b_8aed_c19229cc6530.slice/crio-6bb27f73742b9ac10091f0ce6d6590e42d21b727943d4e04c252c5914351323c WatchSource:0}: Error finding container 6bb27f73742b9ac10091f0ce6d6590e42d21b727943d4e04c252c5914351323c: Status 404 returned error can't find the container with id 6bb27f73742b9ac10091f0ce6d6590e42d21b727943d4e04c252c5914351323c
Apr 23 08:57:30.000039 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:30.000006 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-99467696b-gfgtw" event={"ID":"1a3a79d8-6d77-452b-8aed-c19229cc6530","Type":"ContainerStarted","Data":"ee43afc77095ca698a4012025b7247f6008628f185c6c798c7f7718690db8b4d"}
Apr 23 08:57:30.000039 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:30.000041 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-99467696b-gfgtw" event={"ID":"1a3a79d8-6d77-452b-8aed-c19229cc6530","Type":"ContainerStarted","Data":"6bb27f73742b9ac10091f0ce6d6590e42d21b727943d4e04c252c5914351323c"}
Apr 23 08:57:30.020838 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:30.020792 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-99467696b-gfgtw" podStartSLOduration=2.020778527 podStartE2EDuration="2.020778527s" podCreationTimestamp="2026-04-23 08:57:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:57:30.018994137 +0000 UTC m=+561.535853481" watchObservedRunningTime="2026-04-23 08:57:30.020778527 +0000 UTC m=+561.537637862"
Apr 23 08:57:38.992110 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:38.992070 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:38.992611 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:38.992147 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:38.996942 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:38.996920 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:39.034165 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:39.034142 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-99467696b-gfgtw"
Apr 23 08:57:39.084421 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:57:39.084368 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bc9557b5f-krddp"]
Apr 23 08:58:04.104784 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.104730 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-bc9557b5f-krddp" podUID="fc9dcdc7-2c30-4e50-bca9-015e477af7e8" containerName="console" containerID="cri-o://de4ad3be8be9210bcee70590b0b4924e02c6e38c7fba87a9c98fca903d94a1b8" gracePeriod=15
Apr 23 08:58:04.343425 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.343401 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bc9557b5f-krddp_fc9dcdc7-2c30-4e50-bca9-015e477af7e8/console/0.log"
Apr 23 08:58:04.343545 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.343462 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:58:04.485518 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.485434 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-console-config\") pod \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") "
Apr 23 08:58:04.485518 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.485512 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-console-serving-cert\") pod \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") "
Apr 23 08:58:04.485751 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.485543 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-oauth-serving-cert\") pod \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") "
Apr 23 08:58:04.485751 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.485593 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-trusted-ca-bundle\") pod \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") "
Apr 23 08:58:04.485751 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.485621 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-console-oauth-config\") pod \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") "
Apr 23 08:58:04.485751 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.485655 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnxq8\" (UniqueName: \"kubernetes.io/projected/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-kube-api-access-bnxq8\") pod \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") "
Apr 23 08:58:04.485751 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.485692 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-service-ca\") pod \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\" (UID: \"fc9dcdc7-2c30-4e50-bca9-015e477af7e8\") "
Apr 23 08:58:04.486000 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.485954 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "fc9dcdc7-2c30-4e50-bca9-015e477af7e8" (UID: "fc9dcdc7-2c30-4e50-bca9-015e477af7e8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:58:04.486144 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.486115 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-console-config" (OuterVolumeSpecName: "console-config") pod "fc9dcdc7-2c30-4e50-bca9-015e477af7e8" (UID: "fc9dcdc7-2c30-4e50-bca9-015e477af7e8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:58:04.486218 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.486138 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "fc9dcdc7-2c30-4e50-bca9-015e477af7e8" (UID: "fc9dcdc7-2c30-4e50-bca9-015e477af7e8"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:58:04.486266 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.486243 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-service-ca" (OuterVolumeSpecName: "service-ca") pod "fc9dcdc7-2c30-4e50-bca9-015e477af7e8" (UID: "fc9dcdc7-2c30-4e50-bca9-015e477af7e8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:58:04.487929 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.487907 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "fc9dcdc7-2c30-4e50-bca9-015e477af7e8" (UID: "fc9dcdc7-2c30-4e50-bca9-015e477af7e8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:58:04.488035 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.488009 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "fc9dcdc7-2c30-4e50-bca9-015e477af7e8" (UID: "fc9dcdc7-2c30-4e50-bca9-015e477af7e8"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:58:04.488035 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.488017 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-kube-api-access-bnxq8" (OuterVolumeSpecName: "kube-api-access-bnxq8") pod "fc9dcdc7-2c30-4e50-bca9-015e477af7e8" (UID: "fc9dcdc7-2c30-4e50-bca9-015e477af7e8"). InnerVolumeSpecName "kube-api-access-bnxq8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:58:04.586470 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.586431 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-console-config\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\""
Apr 23 08:58:04.586470 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.586467 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-console-serving-cert\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\""
Apr 23 08:58:04.586470 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.586477 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-oauth-serving-cert\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\""
Apr 23 08:58:04.586651 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.586486 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-trusted-ca-bundle\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\""
Apr 23 08:58:04.586651 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.586496 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-console-oauth-config\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\""
Apr 23 08:58:04.586651 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.586505 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bnxq8\" (UniqueName: \"kubernetes.io/projected/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-kube-api-access-bnxq8\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\""
Apr 23 08:58:04.586651 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:04.586514 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc9dcdc7-2c30-4e50-bca9-015e477af7e8-service-ca\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\""
Apr 23 08:58:05.119474 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:05.119446 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bc9557b5f-krddp_fc9dcdc7-2c30-4e50-bca9-015e477af7e8/console/0.log"
Apr 23 08:58:05.119834 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:05.119489 2574 generic.go:358] "Generic (PLEG): container finished" podID="fc9dcdc7-2c30-4e50-bca9-015e477af7e8" containerID="de4ad3be8be9210bcee70590b0b4924e02c6e38c7fba87a9c98fca903d94a1b8" exitCode=2
Apr 23 08:58:05.119834 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:05.119522 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bc9557b5f-krddp" event={"ID":"fc9dcdc7-2c30-4e50-bca9-015e477af7e8","Type":"ContainerDied","Data":"de4ad3be8be9210bcee70590b0b4924e02c6e38c7fba87a9c98fca903d94a1b8"}
Apr 23 08:58:05.119834 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:05.119566 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bc9557b5f-krddp"
Apr 23 08:58:05.119834 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:05.119577 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bc9557b5f-krddp" event={"ID":"fc9dcdc7-2c30-4e50-bca9-015e477af7e8","Type":"ContainerDied","Data":"199a0fabd24855daa18958eb36e821dc2b3265e9a9955adc4d01f94138e0b5e4"}
Apr 23 08:58:05.119834 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:05.119600 2574 scope.go:117] "RemoveContainer" containerID="de4ad3be8be9210bcee70590b0b4924e02c6e38c7fba87a9c98fca903d94a1b8"
Apr 23 08:58:05.128022 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:05.128004 2574 scope.go:117] "RemoveContainer" containerID="de4ad3be8be9210bcee70590b0b4924e02c6e38c7fba87a9c98fca903d94a1b8"
Apr 23 08:58:05.128290 ip-10-0-139-48 kubenswrapper[2574]: E0423 08:58:05.128270 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de4ad3be8be9210bcee70590b0b4924e02c6e38c7fba87a9c98fca903d94a1b8\": container with ID starting with de4ad3be8be9210bcee70590b0b4924e02c6e38c7fba87a9c98fca903d94a1b8 not found: ID does not exist" containerID="de4ad3be8be9210bcee70590b0b4924e02c6e38c7fba87a9c98fca903d94a1b8"
Apr 23 08:58:05.128340 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:05.128298 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de4ad3be8be9210bcee70590b0b4924e02c6e38c7fba87a9c98fca903d94a1b8"} err="failed to get container status \"de4ad3be8be9210bcee70590b0b4924e02c6e38c7fba87a9c98fca903d94a1b8\": rpc error: code = NotFound desc = could not find container \"de4ad3be8be9210bcee70590b0b4924e02c6e38c7fba87a9c98fca903d94a1b8\": container with ID starting with de4ad3be8be9210bcee70590b0b4924e02c6e38c7fba87a9c98fca903d94a1b8 not found: ID does not exist"
Apr 23 08:58:05.139875 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:05.139851 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bc9557b5f-krddp"]
Apr 23 08:58:05.144077 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:05.144060 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-bc9557b5f-krddp"]
Apr 23 08:58:07.072932 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:07.072892 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc9dcdc7-2c30-4e50-bca9-015e477af7e8" path="/var/lib/kubelet/pods/fc9dcdc7-2c30-4e50-bca9-015e477af7e8/volumes"
Apr 23 08:58:08.980789 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:08.980756 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log"
Apr 23 08:58:08.981228 ip-10-0-139-48 kubenswrapper[2574]: I0423 08:58:08.981193 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log"
Apr 23 09:03:09.009908 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:03:09.009836 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log"
Apr 23 09:03:09.016259 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:03:09.016241 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log"
Apr 23 09:06:47.323234 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:06:47.323199 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-2xbdq/test-trainjob-jzhrk-node-0-0-jzshd"]
Apr 23 09:06:47.323756 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:06:47.323595 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc9dcdc7-2c30-4e50-bca9-015e477af7e8" containerName="console"
Apr 23 09:06:47.323756 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:06:47.323608 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9dcdc7-2c30-4e50-bca9-015e477af7e8" containerName="console"
Apr 23 09:06:47.323756 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:06:47.323664 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc9dcdc7-2c30-4e50-bca9-015e477af7e8" containerName="console"
Apr 23 09:06:47.326583 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:06:47.326566 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-2xbdq/test-trainjob-jzhrk-node-0-0-jzshd"
Apr 23 09:06:47.328124 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:06:47.328096 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzb7v\" (UniqueName: \"kubernetes.io/projected/5205e70f-35e3-4df5-a1e7-ed71364a7ec0-kube-api-access-dzb7v\") pod \"test-trainjob-jzhrk-node-0-0-jzshd\" (UID: \"5205e70f-35e3-4df5-a1e7-ed71364a7ec0\") " pod="test-ns-2xbdq/test-trainjob-jzhrk-node-0-0-jzshd"
Apr 23 09:06:47.329485 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:06:47.329460 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-2xbdq\"/\"openshift-service-ca.crt\""
Apr 23 09:06:47.329584 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:06:47.329505 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-2xbdq\"/\"kube-root-ca.crt\""
Apr 23 09:06:47.330627 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:06:47.330612 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-2xbdq\"/\"default-dockercfg-d9kz8\""
Apr 23 09:06:47.334755 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:06:47.334733 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-2xbdq/test-trainjob-jzhrk-node-0-0-jzshd"]
Apr 23 09:06:47.428441 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:06:47.428407 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzb7v\" (UniqueName: \"kubernetes.io/projected/5205e70f-35e3-4df5-a1e7-ed71364a7ec0-kube-api-access-dzb7v\") pod \"test-trainjob-jzhrk-node-0-0-jzshd\" (UID: \"5205e70f-35e3-4df5-a1e7-ed71364a7ec0\") " pod="test-ns-2xbdq/test-trainjob-jzhrk-node-0-0-jzshd"
Apr 23 09:06:47.437133 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:06:47.437109 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzb7v\" (UniqueName: \"kubernetes.io/projected/5205e70f-35e3-4df5-a1e7-ed71364a7ec0-kube-api-access-dzb7v\") pod \"test-trainjob-jzhrk-node-0-0-jzshd\" (UID: \"5205e70f-35e3-4df5-a1e7-ed71364a7ec0\") " pod="test-ns-2xbdq/test-trainjob-jzhrk-node-0-0-jzshd"
Apr 23 09:06:47.637092 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:06:47.637066 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-2xbdq/test-trainjob-jzhrk-node-0-0-jzshd"
Apr 23 09:06:47.762707 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:06:47.762676 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-2xbdq/test-trainjob-jzhrk-node-0-0-jzshd"]
Apr 23 09:06:47.764313 ip-10-0-139-48 kubenswrapper[2574]: W0423 09:06:47.764287 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5205e70f_35e3_4df5_a1e7_ed71364a7ec0.slice/crio-b61b559895bff426d74ba093d710f6dbb2972584cb9b7222cf501ef45aca0c60 WatchSource:0}: Error finding container b61b559895bff426d74ba093d710f6dbb2972584cb9b7222cf501ef45aca0c60: Status 404 returned error can't find the container with id b61b559895bff426d74ba093d710f6dbb2972584cb9b7222cf501ef45aca0c60
Apr 23 09:06:47.766344 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:06:47.766327 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 09:06:47.947930 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:06:47.947848 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-2xbdq/test-trainjob-jzhrk-node-0-0-jzshd" event={"ID":"5205e70f-35e3-4df5-a1e7-ed71364a7ec0","Type":"ContainerStarted","Data":"b61b559895bff426d74ba093d710f6dbb2972584cb9b7222cf501ef45aca0c60"}
Apr 23 09:08:09.039933 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:08:09.039910 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log"
Apr 23 09:08:09.046648 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:08:09.046630 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log"
Apr 23 09:08:09.273461 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:08:09.273350 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-2xbdq/test-trainjob-jzhrk-node-0-0-jzshd" event={"ID":"5205e70f-35e3-4df5-a1e7-ed71364a7ec0","Type":"ContainerStarted","Data":"36632fe4220c40e837beb032535e826ceb55a4d0676d069ad9358863a4f15aaa"}
Apr 23 09:08:09.292309 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:08:09.292248 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-2xbdq/test-trainjob-jzhrk-node-0-0-jzshd" podStartSLOduration=1.0998363310000001 podStartE2EDuration="1m22.292232561s" podCreationTimestamp="2026-04-23 09:06:47 +0000 UTC" firstStartedPulling="2026-04-23 09:06:47.766483748 +0000 UTC m=+1119.283343061" lastFinishedPulling="2026-04-23 09:08:08.958879979 +0000 UTC m=+1200.475739291" observedRunningTime="2026-04-23 09:08:09.290763255 +0000 UTC m=+1200.807622592" watchObservedRunningTime="2026-04-23 09:08:09.292232561 +0000 UTC m=+1200.809091893"
Apr 23 09:08:12.284513 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:08:12.284477 2574 generic.go:358] "Generic (PLEG): container finished" podID="5205e70f-35e3-4df5-a1e7-ed71364a7ec0" containerID="36632fe4220c40e837beb032535e826ceb55a4d0676d069ad9358863a4f15aaa" exitCode=0
Apr 23 09:08:12.284902 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:08:12.284553 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-2xbdq/test-trainjob-jzhrk-node-0-0-jzshd" event={"ID":"5205e70f-35e3-4df5-a1e7-ed71364a7ec0","Type":"ContainerDied","Data":"36632fe4220c40e837beb032535e826ceb55a4d0676d069ad9358863a4f15aaa"}
Apr 23 09:08:13.412795 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:08:13.412768 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-2xbdq/test-trainjob-jzhrk-node-0-0-jzshd"
Apr 23 09:08:13.520177 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:08:13.520148 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzb7v\" (UniqueName: \"kubernetes.io/projected/5205e70f-35e3-4df5-a1e7-ed71364a7ec0-kube-api-access-dzb7v\") pod \"5205e70f-35e3-4df5-a1e7-ed71364a7ec0\" (UID: \"5205e70f-35e3-4df5-a1e7-ed71364a7ec0\") "
Apr 23 09:08:13.522446 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:08:13.522416 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5205e70f-35e3-4df5-a1e7-ed71364a7ec0-kube-api-access-dzb7v" (OuterVolumeSpecName: "kube-api-access-dzb7v") pod "5205e70f-35e3-4df5-a1e7-ed71364a7ec0" (UID: "5205e70f-35e3-4df5-a1e7-ed71364a7ec0"). InnerVolumeSpecName "kube-api-access-dzb7v". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 09:08:13.620957 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:08:13.620928 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dzb7v\" (UniqueName: \"kubernetes.io/projected/5205e70f-35e3-4df5-a1e7-ed71364a7ec0-kube-api-access-dzb7v\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\""
Apr 23 09:08:14.292333 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:08:14.292307 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-2xbdq/test-trainjob-jzhrk-node-0-0-jzshd" Apr 23 09:08:14.292333 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:08:14.292314 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-2xbdq/test-trainjob-jzhrk-node-0-0-jzshd" event={"ID":"5205e70f-35e3-4df5-a1e7-ed71364a7ec0","Type":"ContainerDied","Data":"b61b559895bff426d74ba093d710f6dbb2972584cb9b7222cf501ef45aca0c60"} Apr 23 09:08:14.292549 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:08:14.292352 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b61b559895bff426d74ba093d710f6dbb2972584cb9b7222cf501ef45aca0c60" Apr 23 09:13:09.074900 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:13:09.074871 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log" Apr 23 09:13:09.083143 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:13:09.083124 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log" Apr 23 09:18:09.104731 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:18:09.104627 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log" Apr 23 09:18:09.112829 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:18:09.112811 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log" Apr 23 09:23:09.133418 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:23:09.133304 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log" Apr 23 09:23:09.149239 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:23:09.149216 2574 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log" Apr 23 09:27:01.530206 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:01.530116 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/test-ns-2xbdq_test-trainjob-jzhrk-node-0-0-jzshd_5205e70f-35e3-4df5-a1e7-ed71364a7ec0/node/0.log" Apr 23 09:27:03.402335 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:03.402298 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2n2tr/must-gather-5s7gn"] Apr 23 09:27:03.402755 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:03.402668 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5205e70f-35e3-4df5-a1e7-ed71364a7ec0" containerName="node" Apr 23 09:27:03.402755 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:03.402679 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5205e70f-35e3-4df5-a1e7-ed71364a7ec0" containerName="node" Apr 23 09:27:03.402830 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:03.402762 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="5205e70f-35e3-4df5-a1e7-ed71364a7ec0" containerName="node" Apr 23 09:27:03.405760 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:03.405741 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2n2tr/must-gather-5s7gn" Apr 23 09:27:03.408905 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:03.408878 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2n2tr\"/\"kube-root-ca.crt\"" Apr 23 09:27:03.409013 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:03.408916 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2n2tr\"/\"openshift-service-ca.crt\"" Apr 23 09:27:03.410346 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:03.410324 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2n2tr\"/\"default-dockercfg-92z5t\"" Apr 23 09:27:03.412584 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:03.412563 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2n2tr/must-gather-5s7gn"] Apr 23 09:27:03.493266 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:03.493230 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5786b907-1f0c-4733-8a34-f9251530b653-must-gather-output\") pod \"must-gather-5s7gn\" (UID: \"5786b907-1f0c-4733-8a34-f9251530b653\") " pod="openshift-must-gather-2n2tr/must-gather-5s7gn" Apr 23 09:27:03.493453 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:03.493272 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwqhj\" (UniqueName: \"kubernetes.io/projected/5786b907-1f0c-4733-8a34-f9251530b653-kube-api-access-gwqhj\") pod \"must-gather-5s7gn\" (UID: \"5786b907-1f0c-4733-8a34-f9251530b653\") " pod="openshift-must-gather-2n2tr/must-gather-5s7gn" Apr 23 09:27:03.594073 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:03.594031 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwqhj\" (UniqueName: 
\"kubernetes.io/projected/5786b907-1f0c-4733-8a34-f9251530b653-kube-api-access-gwqhj\") pod \"must-gather-5s7gn\" (UID: \"5786b907-1f0c-4733-8a34-f9251530b653\") " pod="openshift-must-gather-2n2tr/must-gather-5s7gn" Apr 23 09:27:03.594251 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:03.594150 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5786b907-1f0c-4733-8a34-f9251530b653-must-gather-output\") pod \"must-gather-5s7gn\" (UID: \"5786b907-1f0c-4733-8a34-f9251530b653\") " pod="openshift-must-gather-2n2tr/must-gather-5s7gn" Apr 23 09:27:03.594511 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:03.594492 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5786b907-1f0c-4733-8a34-f9251530b653-must-gather-output\") pod \"must-gather-5s7gn\" (UID: \"5786b907-1f0c-4733-8a34-f9251530b653\") " pod="openshift-must-gather-2n2tr/must-gather-5s7gn" Apr 23 09:27:03.602635 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:03.602609 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwqhj\" (UniqueName: \"kubernetes.io/projected/5786b907-1f0c-4733-8a34-f9251530b653-kube-api-access-gwqhj\") pod \"must-gather-5s7gn\" (UID: \"5786b907-1f0c-4733-8a34-f9251530b653\") " pod="openshift-must-gather-2n2tr/must-gather-5s7gn" Apr 23 09:27:03.715707 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:03.715635 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2n2tr/must-gather-5s7gn" Apr 23 09:27:03.839872 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:03.839844 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2n2tr/must-gather-5s7gn"] Apr 23 09:27:03.841773 ip-10-0-139-48 kubenswrapper[2574]: W0423 09:27:03.841741 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5786b907_1f0c_4733_8a34_f9251530b653.slice/crio-edaf0499ef5f678dfbaed1e6646fdf52b55a18a59bdb7154adfbe5b8492cb851 WatchSource:0}: Error finding container edaf0499ef5f678dfbaed1e6646fdf52b55a18a59bdb7154adfbe5b8492cb851: Status 404 returned error can't find the container with id edaf0499ef5f678dfbaed1e6646fdf52b55a18a59bdb7154adfbe5b8492cb851 Apr 23 09:27:03.843573 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:03.843556 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 09:27:04.286578 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:04.286540 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n2tr/must-gather-5s7gn" event={"ID":"5786b907-1f0c-4733-8a34-f9251530b653","Type":"ContainerStarted","Data":"edaf0499ef5f678dfbaed1e6646fdf52b55a18a59bdb7154adfbe5b8492cb851"} Apr 23 09:27:06.563809 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:06.563741 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-2xbdq/test-trainjob-jzhrk-node-0-0-jzshd"] Apr 23 09:27:06.566101 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:06.566074 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-2xbdq/test-trainjob-jzhrk-node-0-0-jzshd"] Apr 23 09:27:07.070333 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:07.070301 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5205e70f-35e3-4df5-a1e7-ed71364a7ec0" path="/var/lib/kubelet/pods/5205e70f-35e3-4df5-a1e7-ed71364a7ec0/volumes" 
Apr 23 09:27:09.150967 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:09.150938 2574 scope.go:117] "RemoveContainer" containerID="36632fe4220c40e837beb032535e826ceb55a4d0676d069ad9358863a4f15aaa" Apr 23 09:27:09.308802 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:09.308762 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n2tr/must-gather-5s7gn" event={"ID":"5786b907-1f0c-4733-8a34-f9251530b653","Type":"ContainerStarted","Data":"005287097b69eb67fefd26b43c1bb5effd4014b6a201f94b949984d4f59711ea"} Apr 23 09:27:09.309042 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:09.309020 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n2tr/must-gather-5s7gn" event={"ID":"5786b907-1f0c-4733-8a34-f9251530b653","Type":"ContainerStarted","Data":"8c94fbf5a09ae3bedbf0d6cf25ca2011bcec2a11773f37503f1463697513c29b"} Apr 23 09:27:09.327438 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:09.327365 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2n2tr/must-gather-5s7gn" podStartSLOduration=1.865839409 podStartE2EDuration="6.327348335s" podCreationTimestamp="2026-04-23 09:27:03 +0000 UTC" firstStartedPulling="2026-04-23 09:27:03.84368343 +0000 UTC m=+2335.360542744" lastFinishedPulling="2026-04-23 09:27:08.305192347 +0000 UTC m=+2339.822051670" observedRunningTime="2026-04-23 09:27:09.326206109 +0000 UTC m=+2340.843065444" watchObservedRunningTime="2026-04-23 09:27:09.327348335 +0000 UTC m=+2340.844207673" Apr 23 09:27:17.867723 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:17.867683 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-gg5zb_71e58cf0-1b23-40b5-8471-cc7b8bba74ff/manager/0.log" Apr 23 09:27:18.291674 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:18.291552 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-gg5zb_71e58cf0-1b23-40b5-8471-cc7b8bba74ff/manager/0.log" Apr 23 09:27:18.709270 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:18.709242 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-gg5zb_71e58cf0-1b23-40b5-8471-cc7b8bba74ff/manager/0.log" Apr 23 09:27:53.482494 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:53.482457 2574 generic.go:358] "Generic (PLEG): container finished" podID="5786b907-1f0c-4733-8a34-f9251530b653" containerID="8c94fbf5a09ae3bedbf0d6cf25ca2011bcec2a11773f37503f1463697513c29b" exitCode=0 Apr 23 09:27:53.482992 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:53.482535 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n2tr/must-gather-5s7gn" event={"ID":"5786b907-1f0c-4733-8a34-f9251530b653","Type":"ContainerDied","Data":"8c94fbf5a09ae3bedbf0d6cf25ca2011bcec2a11773f37503f1463697513c29b"} Apr 23 09:27:53.482992 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:53.482908 2574 scope.go:117] "RemoveContainer" containerID="8c94fbf5a09ae3bedbf0d6cf25ca2011bcec2a11773f37503f1463697513c29b" Apr 23 09:27:53.638896 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:53.638860 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2n2tr_must-gather-5s7gn_5786b907-1f0c-4733-8a34-f9251530b653/gather/0.log" Apr 23 09:27:57.181475 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:57.181428 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-qtlms_1ee7a30a-d0ec-443c-89c2-577e277215a1/global-pull-secret-syncer/0.log" Apr 23 09:27:57.261524 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:57.261495 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-td2mf_ee9e598b-755e-47f8-95fd-9e72a79972ac/konnectivity-agent/0.log" Apr 23 09:27:57.368721 ip-10-0-139-48 
kubenswrapper[2574]: I0423 09:27:57.368691 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-48.ec2.internal_20c73c33a3b7b77bd6beaa98c127d630/haproxy/0.log" Apr 23 09:27:59.010382 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.010351 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2n2tr/must-gather-5s7gn"] Apr 23 09:27:59.010891 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.010565 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-2n2tr/must-gather-5s7gn" podUID="5786b907-1f0c-4733-8a34-f9251530b653" containerName="copy" containerID="cri-o://005287097b69eb67fefd26b43c1bb5effd4014b6a201f94b949984d4f59711ea" gracePeriod=2 Apr 23 09:27:59.013436 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.013378 2574 status_manager.go:895] "Failed to get status for pod" podUID="5786b907-1f0c-4733-8a34-f9251530b653" pod="openshift-must-gather-2n2tr/must-gather-5s7gn" err="pods \"must-gather-5s7gn\" is forbidden: User \"system:node:ip-10-0-139-48.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-2n2tr\": no relationship found between node 'ip-10-0-139-48.ec2.internal' and this object" Apr 23 09:27:59.013822 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.013788 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2n2tr/must-gather-5s7gn"] Apr 23 09:27:59.070580 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.070549 2574 status_manager.go:895] "Failed to get status for pod" podUID="5786b907-1f0c-4733-8a34-f9251530b653" pod="openshift-must-gather-2n2tr/must-gather-5s7gn" err="pods \"must-gather-5s7gn\" is forbidden: User \"system:node:ip-10-0-139-48.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-2n2tr\": no relationship found between node 'ip-10-0-139-48.ec2.internal' and this object" Apr 
23 09:27:59.245791 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.245771 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2n2tr_must-gather-5s7gn_5786b907-1f0c-4733-8a34-f9251530b653/copy/0.log" Apr 23 09:27:59.246125 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.246110 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n2tr/must-gather-5s7gn" Apr 23 09:27:59.394206 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.394182 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwqhj\" (UniqueName: \"kubernetes.io/projected/5786b907-1f0c-4733-8a34-f9251530b653-kube-api-access-gwqhj\") pod \"5786b907-1f0c-4733-8a34-f9251530b653\" (UID: \"5786b907-1f0c-4733-8a34-f9251530b653\") " Apr 23 09:27:59.394359 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.394274 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5786b907-1f0c-4733-8a34-f9251530b653-must-gather-output\") pod \"5786b907-1f0c-4733-8a34-f9251530b653\" (UID: \"5786b907-1f0c-4733-8a34-f9251530b653\") " Apr 23 09:27:59.396476 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.396452 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5786b907-1f0c-4733-8a34-f9251530b653-kube-api-access-gwqhj" (OuterVolumeSpecName: "kube-api-access-gwqhj") pod "5786b907-1f0c-4733-8a34-f9251530b653" (UID: "5786b907-1f0c-4733-8a34-f9251530b653"). InnerVolumeSpecName "kube-api-access-gwqhj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:27:59.396476 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.396469 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5786b907-1f0c-4733-8a34-f9251530b653-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "5786b907-1f0c-4733-8a34-f9251530b653" (UID: "5786b907-1f0c-4733-8a34-f9251530b653"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 09:27:59.494809 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.494784 2574 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5786b907-1f0c-4733-8a34-f9251530b653-must-gather-output\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 09:27:59.494809 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.494807 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gwqhj\" (UniqueName: \"kubernetes.io/projected/5786b907-1f0c-4733-8a34-f9251530b653-kube-api-access-gwqhj\") on node \"ip-10-0-139-48.ec2.internal\" DevicePath \"\"" Apr 23 09:27:59.503678 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.503659 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2n2tr_must-gather-5s7gn_5786b907-1f0c-4733-8a34-f9251530b653/copy/0.log" Apr 23 09:27:59.503956 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.503932 2574 generic.go:358] "Generic (PLEG): container finished" podID="5786b907-1f0c-4733-8a34-f9251530b653" containerID="005287097b69eb67fefd26b43c1bb5effd4014b6a201f94b949984d4f59711ea" exitCode=143 Apr 23 09:27:59.504019 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.503991 2574 scope.go:117] "RemoveContainer" containerID="005287097b69eb67fefd26b43c1bb5effd4014b6a201f94b949984d4f59711ea" Apr 23 09:27:59.504057 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.503992 2574 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-2n2tr/must-gather-5s7gn" Apr 23 09:27:59.511966 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.511946 2574 scope.go:117] "RemoveContainer" containerID="8c94fbf5a09ae3bedbf0d6cf25ca2011bcec2a11773f37503f1463697513c29b" Apr 23 09:27:59.524181 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.524162 2574 scope.go:117] "RemoveContainer" containerID="005287097b69eb67fefd26b43c1bb5effd4014b6a201f94b949984d4f59711ea" Apr 23 09:27:59.524427 ip-10-0-139-48 kubenswrapper[2574]: E0423 09:27:59.524405 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"005287097b69eb67fefd26b43c1bb5effd4014b6a201f94b949984d4f59711ea\": container with ID starting with 005287097b69eb67fefd26b43c1bb5effd4014b6a201f94b949984d4f59711ea not found: ID does not exist" containerID="005287097b69eb67fefd26b43c1bb5effd4014b6a201f94b949984d4f59711ea" Apr 23 09:27:59.524471 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.524443 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"005287097b69eb67fefd26b43c1bb5effd4014b6a201f94b949984d4f59711ea"} err="failed to get container status \"005287097b69eb67fefd26b43c1bb5effd4014b6a201f94b949984d4f59711ea\": rpc error: code = NotFound desc = could not find container \"005287097b69eb67fefd26b43c1bb5effd4014b6a201f94b949984d4f59711ea\": container with ID starting with 005287097b69eb67fefd26b43c1bb5effd4014b6a201f94b949984d4f59711ea not found: ID does not exist" Apr 23 09:27:59.524471 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.524466 2574 scope.go:117] "RemoveContainer" containerID="8c94fbf5a09ae3bedbf0d6cf25ca2011bcec2a11773f37503f1463697513c29b" Apr 23 09:27:59.524682 ip-10-0-139-48 kubenswrapper[2574]: E0423 09:27:59.524665 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8c94fbf5a09ae3bedbf0d6cf25ca2011bcec2a11773f37503f1463697513c29b\": container with ID starting with 8c94fbf5a09ae3bedbf0d6cf25ca2011bcec2a11773f37503f1463697513c29b not found: ID does not exist" containerID="8c94fbf5a09ae3bedbf0d6cf25ca2011bcec2a11773f37503f1463697513c29b" Apr 23 09:27:59.524718 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:27:59.524688 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c94fbf5a09ae3bedbf0d6cf25ca2011bcec2a11773f37503f1463697513c29b"} err="failed to get container status \"8c94fbf5a09ae3bedbf0d6cf25ca2011bcec2a11773f37503f1463697513c29b\": rpc error: code = NotFound desc = could not find container \"8c94fbf5a09ae3bedbf0d6cf25ca2011bcec2a11773f37503f1463697513c29b\": container with ID starting with 8c94fbf5a09ae3bedbf0d6cf25ca2011bcec2a11773f37503f1463697513c29b not found: ID does not exist" Apr 23 09:28:00.102724 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:00.102682 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_088441f8-e5d3-4665-81e0-ded5627fa821/alertmanager/0.log" Apr 23 09:28:00.143436 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:00.143413 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_088441f8-e5d3-4665-81e0-ded5627fa821/config-reloader/0.log" Apr 23 09:28:00.178313 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:00.178292 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_088441f8-e5d3-4665-81e0-ded5627fa821/kube-rbac-proxy-web/0.log" Apr 23 09:28:00.216509 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:00.216480 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_088441f8-e5d3-4665-81e0-ded5627fa821/kube-rbac-proxy/0.log" Apr 23 09:28:00.253181 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:00.253163 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_088441f8-e5d3-4665-81e0-ded5627fa821/kube-rbac-proxy-metric/0.log" Apr 23 09:28:00.283636 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:00.283618 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_088441f8-e5d3-4665-81e0-ded5627fa821/prom-label-proxy/0.log" Apr 23 09:28:00.319197 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:00.319181 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_088441f8-e5d3-4665-81e0-ded5627fa821/init-config-reloader/0.log" Apr 23 09:28:00.393681 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:00.393616 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jjhq5_be5352cb-7773-4886-82c2-b96726cfae23/kube-state-metrics/0.log" Apr 23 09:28:00.417223 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:00.417197 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jjhq5_be5352cb-7773-4886-82c2-b96726cfae23/kube-rbac-proxy-main/0.log" Apr 23 09:28:00.444472 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:00.444449 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jjhq5_be5352cb-7773-4886-82c2-b96726cfae23/kube-rbac-proxy-self/0.log" Apr 23 09:28:00.477275 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:00.477252 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-666ff68b9f-kzfmt_2ed1b7b6-6eff-4336-96e8-84d4433bb4bc/metrics-server/0.log" Apr 23 09:28:00.620258 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:00.620234 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-htrdc_32bc556f-c293-4692-aaf0-8ddd74da2d7e/node-exporter/0.log" Apr 23 09:28:00.653163 ip-10-0-139-48 kubenswrapper[2574]: I0423 
09:28:00.653108 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-htrdc_32bc556f-c293-4692-aaf0-8ddd74da2d7e/kube-rbac-proxy/0.log"
Apr 23 09:28:00.683999 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:00.683983 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-htrdc_32bc556f-c293-4692-aaf0-8ddd74da2d7e/init-textfile/0.log"
Apr 23 09:28:00.821397 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:00.821371 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-dzd2w_c2661932-d940-478a-a53c-e34fb0fafcf8/kube-rbac-proxy-main/0.log"
Apr 23 09:28:00.854214 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:00.854190 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-dzd2w_c2661932-d940-478a-a53c-e34fb0fafcf8/kube-rbac-proxy-self/0.log"
Apr 23 09:28:00.883056 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:00.883038 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-dzd2w_c2661932-d940-478a-a53c-e34fb0fafcf8/openshift-state-metrics/0.log"
Apr 23 09:28:01.076648 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:01.076616 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5786b907-1f0c-4733-8a34-f9251530b653" path="/var/lib/kubelet/pods/5786b907-1f0c-4733-8a34-f9251530b653/volumes"
Apr 23 09:28:01.204368 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:01.204329 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-r5k8h_55fef949-2a3f-4c26-aa87-1598dd40e0f4/prometheus-operator-admission-webhook/0.log"
Apr 23 09:28:01.236368 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:01.236331 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-59b68b5fdf-d2cb4_4aa32170-88fe-4329-9507-487a81e56002/telemeter-client/0.log"
Apr 23 09:28:01.258964 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:01.258934 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-59b68b5fdf-d2cb4_4aa32170-88fe-4329-9507-487a81e56002/reload/0.log"
Apr 23 09:28:01.280786 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:01.280752 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-59b68b5fdf-d2cb4_4aa32170-88fe-4329-9507-487a81e56002/kube-rbac-proxy/0.log"
Apr 23 09:28:01.313512 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:01.313481 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57d897d645-7xzz9_8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3/thanos-query/0.log"
Apr 23 09:28:01.338016 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:01.337954 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57d897d645-7xzz9_8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3/kube-rbac-proxy-web/0.log"
Apr 23 09:28:01.361608 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:01.361589 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57d897d645-7xzz9_8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3/kube-rbac-proxy/0.log"
Apr 23 09:28:01.385224 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:01.385201 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57d897d645-7xzz9_8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3/prom-label-proxy/0.log"
Apr 23 09:28:01.408043 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:01.408022 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57d897d645-7xzz9_8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3/kube-rbac-proxy-rules/0.log"
Apr 23 09:28:01.434145 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:01.434123 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57d897d645-7xzz9_8d8f8720-5c0e-48fa-93b5-fe7e1fa8f4c3/kube-rbac-proxy-metrics/0.log"
Apr 23 09:28:02.637566 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:02.637529 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-pzrvb_601b91da-cc29-4ccb-a0f3-c184e435eef8/networking-console-plugin/0.log"
Apr 23 09:28:03.493151 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:03.493119 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-99467696b-gfgtw_1a3a79d8-6d77-452b-8aed-c19229cc6530/console/0.log"
Apr 23 09:28:03.932873 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:03.932840 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-9vxvt_5ddb114e-9238-4d4e-9e66-f1b3f964c562/volume-data-source-validator/0.log"
Apr 23 09:28:04.286613 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.286540 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"]
Apr 23 09:28:04.286924 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.286910 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5786b907-1f0c-4733-8a34-f9251530b653" containerName="copy"
Apr 23 09:28:04.286965 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.286926 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5786b907-1f0c-4733-8a34-f9251530b653" containerName="copy"
Apr 23 09:28:04.286965 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.286950 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5786b907-1f0c-4733-8a34-f9251530b653" containerName="gather"
Apr 23 09:28:04.286965 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.286956 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5786b907-1f0c-4733-8a34-f9251530b653" containerName="gather"
Apr 23 09:28:04.287051 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.287020 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="5786b907-1f0c-4733-8a34-f9251530b653" containerName="gather"
Apr 23 09:28:04.287051 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.287029 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="5786b907-1f0c-4733-8a34-f9251530b653" containerName="copy"
Apr 23 09:28:04.292454 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.292430 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"
Apr 23 09:28:04.295618 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.295595 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-29qjf\"/\"kube-root-ca.crt\""
Apr 23 09:28:04.295841 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.295733 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-29qjf\"/\"default-dockercfg-nnl8j\""
Apr 23 09:28:04.295841 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.295629 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-29qjf\"/\"openshift-service-ca.crt\""
Apr 23 09:28:04.296745 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.296716 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"]
Apr 23 09:28:04.437180 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.437143 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/80999363-c366-40c4-89ed-39da88630b47-lib-modules\") pod \"perf-node-gather-daemonset-q8n5w\" (UID: \"80999363-c366-40c4-89ed-39da88630b47\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"
Apr 23 09:28:04.437348 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.437189 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gpjh\" (UniqueName: \"kubernetes.io/projected/80999363-c366-40c4-89ed-39da88630b47-kube-api-access-8gpjh\") pod \"perf-node-gather-daemonset-q8n5w\" (UID: \"80999363-c366-40c4-89ed-39da88630b47\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"
Apr 23 09:28:04.437348 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.437322 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/80999363-c366-40c4-89ed-39da88630b47-sys\") pod \"perf-node-gather-daemonset-q8n5w\" (UID: \"80999363-c366-40c4-89ed-39da88630b47\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"
Apr 23 09:28:04.437463 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.437348 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/80999363-c366-40c4-89ed-39da88630b47-proc\") pod \"perf-node-gather-daemonset-q8n5w\" (UID: \"80999363-c366-40c4-89ed-39da88630b47\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"
Apr 23 09:28:04.437463 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.437371 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/80999363-c366-40c4-89ed-39da88630b47-podres\") pod \"perf-node-gather-daemonset-q8n5w\" (UID: \"80999363-c366-40c4-89ed-39da88630b47\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"
Apr 23 09:28:04.538249 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.538160 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/80999363-c366-40c4-89ed-39da88630b47-sys\") pod \"perf-node-gather-daemonset-q8n5w\" (UID: \"80999363-c366-40c4-89ed-39da88630b47\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"
Apr 23 09:28:04.538249 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.538198 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/80999363-c366-40c4-89ed-39da88630b47-proc\") pod \"perf-node-gather-daemonset-q8n5w\" (UID: \"80999363-c366-40c4-89ed-39da88630b47\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"
Apr 23 09:28:04.538249 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.538217 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/80999363-c366-40c4-89ed-39da88630b47-podres\") pod \"perf-node-gather-daemonset-q8n5w\" (UID: \"80999363-c366-40c4-89ed-39da88630b47\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"
Apr 23 09:28:04.538249 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.538237 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/80999363-c366-40c4-89ed-39da88630b47-lib-modules\") pod \"perf-node-gather-daemonset-q8n5w\" (UID: \"80999363-c366-40c4-89ed-39da88630b47\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"
Apr 23 09:28:04.538590 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.538261 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gpjh\" (UniqueName: \"kubernetes.io/projected/80999363-c366-40c4-89ed-39da88630b47-kube-api-access-8gpjh\") pod \"perf-node-gather-daemonset-q8n5w\" (UID: \"80999363-c366-40c4-89ed-39da88630b47\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"
Apr 23 09:28:04.538590 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.538296 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/80999363-c366-40c4-89ed-39da88630b47-sys\") pod \"perf-node-gather-daemonset-q8n5w\" (UID: \"80999363-c366-40c4-89ed-39da88630b47\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"
Apr 23 09:28:04.538590 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.538297 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/80999363-c366-40c4-89ed-39da88630b47-proc\") pod \"perf-node-gather-daemonset-q8n5w\" (UID: \"80999363-c366-40c4-89ed-39da88630b47\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"
Apr 23 09:28:04.538590 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.538366 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/80999363-c366-40c4-89ed-39da88630b47-podres\") pod \"perf-node-gather-daemonset-q8n5w\" (UID: \"80999363-c366-40c4-89ed-39da88630b47\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"
Apr 23 09:28:04.538590 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.538417 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/80999363-c366-40c4-89ed-39da88630b47-lib-modules\") pod \"perf-node-gather-daemonset-q8n5w\" (UID: \"80999363-c366-40c4-89ed-39da88630b47\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"
Apr 23 09:28:04.547455 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.547426 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gpjh\" (UniqueName: \"kubernetes.io/projected/80999363-c366-40c4-89ed-39da88630b47-kube-api-access-8gpjh\") pod \"perf-node-gather-daemonset-q8n5w\" (UID: \"80999363-c366-40c4-89ed-39da88630b47\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"
Apr 23 09:28:04.603465 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.603427 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"
Apr 23 09:28:04.723441 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.723416 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"]
Apr 23 09:28:04.725605 ip-10-0-139-48 kubenswrapper[2574]: W0423 09:28:04.725575 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod80999363_c366_40c4_89ed_39da88630b47.slice/crio-5febd5c2f487d7cb2f38b154610acb938748d763d68d25686b550fdc31466868 WatchSource:0}: Error finding container 5febd5c2f487d7cb2f38b154610acb938748d763d68d25686b550fdc31466868: Status 404 returned error can't find the container with id 5febd5c2f487d7cb2f38b154610acb938748d763d68d25686b550fdc31466868
Apr 23 09:28:04.729707 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.729679 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fxlql_67ac0c42-3257-496f-9fbd-98d7b980d7d5/dns/0.log"
Apr 23 09:28:04.755625 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.755594 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fxlql_67ac0c42-3257-496f-9fbd-98d7b980d7d5/kube-rbac-proxy/0.log"
Apr 23 09:28:04.950876 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:04.950844 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ffdbq_e556abe8-644b-4251-99a6-a109bfc8c173/dns-node-resolver/0.log"
Apr 23 09:28:05.517498 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:05.517475 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-px7gw_3b1aa913-059c-4ed1-9c77-7a25bcbdb7f0/node-ca/0.log"
Apr 23 09:28:05.526445 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:05.526415 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w" event={"ID":"80999363-c366-40c4-89ed-39da88630b47","Type":"ContainerStarted","Data":"75efd5842cd04dacae18794be3cf6704cd97292d32fc70381c5bcf0258ed703f"}
Apr 23 09:28:05.526583 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:05.526452 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w" event={"ID":"80999363-c366-40c4-89ed-39da88630b47","Type":"ContainerStarted","Data":"5febd5c2f487d7cb2f38b154610acb938748d763d68d25686b550fdc31466868"}
Apr 23 09:28:05.526583 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:05.526534 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"
Apr 23 09:28:05.545841 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:05.545806 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w" podStartSLOduration=1.545794567 podStartE2EDuration="1.545794567s" podCreationTimestamp="2026-04-23 09:28:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:28:05.543649212 +0000 UTC m=+2397.060508538" watchObservedRunningTime="2026-04-23 09:28:05.545794567 +0000 UTC m=+2397.062653902"
Apr 23 09:28:06.773668 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:06.773641 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-pnnnr_85967c1f-fcc5-477d-949f-62d16a82fb18/serve-healthcheck-canary/0.log"
Apr 23 09:28:07.140799 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:07.140765 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-wsgmm_976abe21-f40b-42ec-b745-de58f1628c36/insights-operator/0.log"
Apr 23 09:28:07.141148 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:07.141132 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-wsgmm_976abe21-f40b-42ec-b745-de58f1628c36/insights-operator/1.log"
Apr 23 09:28:07.159187 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:07.159166 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-g996f_d223acb1-cce5-46a5-9460-64ee56b66cf1/kube-rbac-proxy/0.log"
Apr 23 09:28:07.180345 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:07.180318 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-g996f_d223acb1-cce5-46a5-9460-64ee56b66cf1/exporter/0.log"
Apr 23 09:28:07.205177 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:07.205160 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-g996f_d223acb1-cce5-46a5-9460-64ee56b66cf1/extractor/0.log"
Apr 23 09:28:09.005823 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:09.005799 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-operator-747c5859c7-t8l5w_98dd6dad-70fe-4607-ac31-47a0875fb851/jobset-operator/0.log"
Apr 23 09:28:09.159924 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:09.159837 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log"
Apr 23 09:28:09.177219 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:09.177198 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log"
Apr 23 09:28:11.541044 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:11.541016 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-q8n5w"
Apr 23 09:28:12.817353 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:12.817320 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-4g296_eba944cc-6425-4433-beef-e901be182078/kube-storage-version-migrator-operator/1.log"
Apr 23 09:28:12.818260 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:12.818227 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-4g296_eba944cc-6425-4433-beef-e901be182078/kube-storage-version-migrator-operator/0.log"
Apr 23 09:28:13.728597 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:13.728526 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5zvll_3e5f1db4-0744-40e3-98f8-2f123ac8d9ab/kube-multus/0.log"
Apr 23 09:28:13.779834 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:13.779808 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-27dts_a619dfec-0cfa-46a8-9c86-7b06735feeaa/kube-multus-additional-cni-plugins/0.log"
Apr 23 09:28:13.803113 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:13.803089 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-27dts_a619dfec-0cfa-46a8-9c86-7b06735feeaa/egress-router-binary-copy/0.log"
Apr 23 09:28:13.830599 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:13.830575 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-27dts_a619dfec-0cfa-46a8-9c86-7b06735feeaa/cni-plugins/0.log"
Apr 23 09:28:13.862689 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:13.862670 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-27dts_a619dfec-0cfa-46a8-9c86-7b06735feeaa/bond-cni-plugin/0.log"
Apr 23 09:28:13.912812 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:13.912791 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-27dts_a619dfec-0cfa-46a8-9c86-7b06735feeaa/routeoverride-cni/0.log"
Apr 23 09:28:13.943264 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:13.943243 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-27dts_a619dfec-0cfa-46a8-9c86-7b06735feeaa/whereabouts-cni-bincopy/0.log"
Apr 23 09:28:13.975644 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:13.975626 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-27dts_a619dfec-0cfa-46a8-9c86-7b06735feeaa/whereabouts-cni/0.log"
Apr 23 09:28:14.896313 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:14.896280 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wnmff_b87e8c66-90ff-454c-9c82-3fe28797e8df/network-metrics-daemon/0.log"
Apr 23 09:28:14.929251 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:14.929229 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wnmff_b87e8c66-90ff-454c-9c82-3fe28797e8df/kube-rbac-proxy/0.log"
Apr 23 09:28:16.486005 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:16.485978 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-controller/0.log"
Apr 23 09:28:16.509523 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:16.509489 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/0.log"
Apr 23 09:28:16.519962 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:16.519938 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovn-acl-logging/1.log"
Apr 23 09:28:16.542359 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:16.542333 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/kube-rbac-proxy-node/0.log"
Apr 23 09:28:16.578731 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:16.578704 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 09:28:16.605981 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:16.605962 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/northd/0.log"
Apr 23 09:28:16.638926 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:16.638905 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/nbdb/0.log"
Apr 23 09:28:16.677316 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:16.677297 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/sbdb/0.log"
Apr 23 09:28:16.784520 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:16.784451 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k9zxl_7f9090b7-c998-4b54-9c85-0df5e37b4d9d/ovnkube-controller/0.log"
Apr 23 09:28:18.016920 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:18.016892 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-wttb6_7e40391b-c30d-4041-b6cf-e4b9e590a499/check-endpoints/0.log"
Apr 23 09:28:18.155252 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:18.155218 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-xksm7_96a803a2-c0b4-4975-b7d2-ca3157aa9f00/network-check-target-container/0.log"
Apr 23 09:28:19.236731 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:19.236704 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-hv5wx_cf591c15-c055-4b83-8692-b39b3dd9ece5/iptables-alerter/0.log"
Apr 23 09:28:20.255029 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:20.255006 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-pvr56_e45e878a-2315-478d-a452-625c243a29cc/tuned/0.log"
Apr 23 09:28:23.221700 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:23.221667 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-cvwjc_6ef9bf61-ead5-4bec-bd04-a167a6f7321f/service-ca-operator/1.log"
Apr 23 09:28:23.222752 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:23.222733 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-cvwjc_6ef9bf61-ead5-4bec-bd04-a167a6f7321f/service-ca-operator/0.log"
Apr 23 09:28:23.548262 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:23.548192 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-nvnmg_61032a47-c259-42d4-9698-cb6a746a73f0/service-ca-controller/0.log"
Apr 23 09:28:23.915118 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:23.915089 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-7lpk2_fcf385b3-bad6-4ba8-ad14-e86462762f6a/csi-driver/0.log"
Apr 23 09:28:23.937478 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:23.937453 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-7lpk2_fcf385b3-bad6-4ba8-ad14-e86462762f6a/csi-node-driver-registrar/0.log"
Apr 23 09:28:23.959968 ip-10-0-139-48 kubenswrapper[2574]: I0423 09:28:23.959950 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-7lpk2_fcf385b3-bad6-4ba8-ad14-e86462762f6a/csi-liveness-probe/0.log"