Apr 16 16:29:07.284943 ip-10-0-139-12 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 16:29:07.744512 ip-10-0-139-12 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:29:07.744512 ip-10-0-139-12 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 16:29:07.744512 ip-10-0-139-12 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:29:07.744512 ip-10-0-139-12 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 16:29:07.744512 ip-10-0-139-12 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:29:07.748579 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.748488 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 16:29:07.754166 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754142 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:29:07.754166 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754164 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:29:07.754166 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754169 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:29:07.754166 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754172 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:29:07.754337 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754175 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:29:07.754337 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754179 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:29:07.754337 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754182 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:29:07.754337 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754184 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:29:07.754337 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754187 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:29:07.754337 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754190 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:29:07.754337 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754192 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:29:07.754337 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754195 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:29:07.754337 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754198 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:29:07.754337 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754201 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:29:07.754337 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754203 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:29:07.754337 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754207 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:29:07.754337 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754211 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:29:07.754337 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754214 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:29:07.754337 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754217 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:29:07.754337 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754220 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:29:07.754337 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754222 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:29:07.754337 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754225 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:29:07.754337 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754233 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:29:07.754337 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754236 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:29:07.754826 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754238 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:29:07.754826 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754241 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:29:07.754826 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754243 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:29:07.754826 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754246 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:29:07.754826 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754249 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:29:07.754826 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754251 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:29:07.754826 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754254 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:29:07.754826 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754257 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:29:07.754826 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754259 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:29:07.754826 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754262 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:29:07.754826 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754264 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:29:07.754826 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754267 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:29:07.754826 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754270 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:29:07.754826 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754273 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:29:07.754826 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754276 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:29:07.754826 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754278 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:29:07.754826 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754281 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:29:07.754826 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754284 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:29:07.754826 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754287 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:29:07.754826 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754290 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:29:07.755347 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754292 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:29:07.755347 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754295 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:29:07.755347 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754297 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:29:07.755347 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754300 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:29:07.755347 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754302 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:29:07.755347 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754305 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:29:07.755347 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754308 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:29:07.755347 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754310 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:29:07.755347 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754314 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:29:07.755347 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754316 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:29:07.755347 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754319 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:29:07.755347 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754322 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:29:07.755347 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754324 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:29:07.755347 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754327 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:29:07.755347 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754330 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:29:07.755347 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754333 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:29:07.755347 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754335 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:29:07.755347 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754338 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:29:07.755347 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754341 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:29:07.755347 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754344 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:29:07.755888 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754346 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:29:07.755888 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754349 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:29:07.755888 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754352 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:29:07.755888 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754356 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:29:07.755888 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754358 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:29:07.755888 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754362 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:29:07.755888 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754364 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:29:07.755888 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754367 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:29:07.755888 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754370 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:29:07.755888 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754373 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:29:07.755888 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754375 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:29:07.755888 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754378 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:29:07.755888 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754381 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:29:07.755888 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754384 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:29:07.755888 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754387 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:29:07.755888 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754390 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:29:07.755888 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754393 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:29:07.755888 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754396 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:29:07.755888 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754399 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:29:07.755888 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754401 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:29:07.756403 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754404 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:29:07.756403 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754406 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:29:07.756403 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754833 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:29:07.756403 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754838 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:29:07.756403 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754842 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:29:07.756403 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754845 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:29:07.756403 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754848 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:29:07.756403 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754851 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:29:07.756403 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754855 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:29:07.756403 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754858 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:29:07.756403 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754861 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:29:07.756403 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754863 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:29:07.756403 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754866 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:29:07.756403 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754869 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:29:07.756403 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754871 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:29:07.756403 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754874 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:29:07.756403 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754877 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:29:07.756403 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754882 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:29:07.756403 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754885 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:29:07.756889 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754888 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:29:07.756889 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754891 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:29:07.756889 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754893 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:29:07.756889 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754896 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:29:07.756889 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754899 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:29:07.756889 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754902 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:29:07.756889 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754904 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:29:07.756889 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754908 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:29:07.756889 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754910 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:29:07.756889 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754913 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:29:07.756889 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754916 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:29:07.756889 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754918 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:29:07.756889 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754921 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:29:07.756889 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754924 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:29:07.756889 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754928 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:29:07.756889 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754932 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:29:07.756889 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754935 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:29:07.756889 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754938 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:29:07.756889 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754941 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754944 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754947 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754950 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754953 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754956 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754959 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754962 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754965 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754968 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754971 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754973 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754977 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754979 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754982 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754984 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754987 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754990 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754992 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754995 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.754997 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:29:07.757357 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755000 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:29:07.757894 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755003 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:29:07.757894 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755006 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:29:07.757894 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755009 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:29:07.757894 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755011 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:29:07.757894 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755014 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:29:07.757894 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755016 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:29:07.757894 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755019 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:29:07.757894 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755022 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:29:07.757894 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755028 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:29:07.757894 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755028 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:29:07.757894 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755030 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:29:07.757894 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755033 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:29:07.757894 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755036 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:29:07.757894 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755038 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:29:07.757894 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755041 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:29:07.757894 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755044 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:29:07.757894 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755046 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:29:07.757894 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755049 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:29:07.757894 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755051 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:29:07.757894 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755054 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755057 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755060 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755062 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755066 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755068 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755071 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755074 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755077 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.755079 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755811 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755820 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755830 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755835 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755840 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755844 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755849 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755853 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755857 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755860 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755864 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755867 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 16:29:07.758380 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755871 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755874 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755878 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755881 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755884 2573 flags.go:64] FLAG: --cloud-config=""
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755887 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755890 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755894 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755897 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755900 2573 flags.go:64] FLAG: --config-dir=""
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755903 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755907 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755911 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755914 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755918 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755921 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755925 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755928 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755931 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755935 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755938 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755942 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755945 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755948 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755951 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 16:29:07.758932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755954 2573 flags.go:64] FLAG: --enable-server="true"
Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755957 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755962 2573 flags.go:64] FLAG: --event-burst="100"
Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755965 2573 flags.go:64] FLAG: --event-qps="50"
Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755968 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755972 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755974 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755978 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755982 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755985 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755988 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755991 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 16:29:07.759552
ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755994 2573 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.755997 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756000 2573 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756003 2573 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756006 2573 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756009 2573 flags.go:64] FLAG: --feature-gates="" Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756013 2573 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756016 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756019 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756022 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756026 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756029 2573 flags.go:64] FLAG: --help="false" Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756032 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-139-12.ec2.internal" Apr 16 16:29:07.759552 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756035 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756038 2573 flags.go:64] FLAG: 
--http-check-frequency="20s" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756041 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756045 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756048 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756051 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756054 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756056 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756060 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756063 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756069 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756072 2573 flags.go:64] FLAG: --kube-reserved="" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756075 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756078 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756081 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 
16:29:07.756084 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756087 2573 flags.go:64] FLAG: --lock-file="" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756090 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756093 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756096 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756101 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756105 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756107 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 16:29:07.760174 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756111 2573 flags.go:64] FLAG: --logging-format="text" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756114 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756117 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756120 2573 flags.go:64] FLAG: --manifest-url="" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756123 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756128 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756157 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 16:29:07.760774 
ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756162 2573 flags.go:64] FLAG: --max-pods="110" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756166 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756169 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756172 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756175 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756179 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756181 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756184 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756193 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756196 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756198 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756202 2573 flags.go:64] FLAG: --pod-cidr="" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756207 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756213 2573 flags.go:64] FLAG: 
--pod-manifest-path="" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756216 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756219 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756222 2573 flags.go:64] FLAG: --port="10250" Apr 16 16:29:07.760774 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756225 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756228 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a9d8752b1d19095d" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756232 2573 flags.go:64] FLAG: --qos-reserved="" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756235 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756238 2573 flags.go:64] FLAG: --register-node="true" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756241 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756244 2573 flags.go:64] FLAG: --register-with-taints="" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756248 2573 flags.go:64] FLAG: --registry-burst="10" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756251 2573 flags.go:64] FLAG: --registry-qps="5" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756254 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756257 2573 flags.go:64] FLAG: --reserved-memory="" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756261 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 16:29:07.761345 
ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756265 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756286 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756292 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756296 2573 flags.go:64] FLAG: --runonce="false" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756299 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756302 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756305 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756308 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756311 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756314 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756317 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756321 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756324 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756326 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 16:29:07.761345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756329 2573 
flags.go:64] FLAG: --storage-driver-user="root" Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756333 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756338 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756341 2573 flags.go:64] FLAG: --system-cgroups="" Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756344 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756350 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756353 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756356 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756360 2573 flags.go:64] FLAG: --tls-min-version="" Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756363 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756366 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756369 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756372 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756375 2573 flags.go:64] FLAG: --v="2" Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756379 2573 flags.go:64] FLAG: --version="false" Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: I0416 
16:29:07.756383 2573 flags.go:64] FLAG: --vmodule="" Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756388 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.756391 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756491 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756496 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756503 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756506 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756509 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756512 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:29:07.761984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756515 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:29:07.762596 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756518 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:29:07.762596 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756521 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:29:07.762596 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756524 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:29:07.762596 ip-10-0-139-12 kubenswrapper[2573]: W0416 
16:29:07.756529 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 16:29:07.762596 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756532 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:29:07.762596 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756535 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:29:07.762596 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756538 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:29:07.762596 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756541 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:29:07.762596 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756544 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:29:07.762596 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756549 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:29:07.762596 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756552 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:29:07.762596 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756554 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:29:07.762596 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756557 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:29:07.762596 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756560 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:29:07.762596 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756563 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:29:07.762596 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756565 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 
16:29:07.762596 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756568 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:29:07.762596 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756571 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:29:07.762596 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756573 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:29:07.762596 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756576 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:29:07.763147 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756579 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:29:07.763147 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756581 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:29:07.763147 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756584 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:29:07.763147 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756587 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:29:07.763147 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756590 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:29:07.763147 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756592 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:29:07.763147 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756595 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:29:07.763147 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756598 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:29:07.763147 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756601 2573 
feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:29:07.763147 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756604 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:29:07.763147 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756607 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:29:07.763147 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756609 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:29:07.763147 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756612 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:29:07.763147 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756614 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:29:07.763147 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756617 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:29:07.763147 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756620 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:29:07.763147 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756622 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:29:07.763147 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756625 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:29:07.763147 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756627 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:29:07.763147 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756630 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:29:07.763642 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756633 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:29:07.763642 ip-10-0-139-12 
kubenswrapper[2573]: W0416 16:29:07.756637 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:29:07.763642 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756640 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:29:07.763642 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756642 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:29:07.763642 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756645 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:29:07.763642 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756661 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:29:07.763642 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756664 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:29:07.763642 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756666 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:29:07.763642 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756669 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:29:07.763642 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756672 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:29:07.763642 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756675 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:29:07.763642 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756678 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:29:07.763642 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756680 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:29:07.763642 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756683 2573 feature_gate.go:328] unrecognized feature gate: 
NewOLMOwnSingleNamespace Apr 16 16:29:07.763642 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756686 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:29:07.763642 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756688 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:29:07.763642 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756691 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:29:07.763642 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756694 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:29:07.763642 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756696 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:29:07.764117 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756702 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:29:07.764117 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756705 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:29:07.764117 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756709 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:29:07.764117 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756711 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:29:07.764117 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756714 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:29:07.764117 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756717 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:29:07.764117 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756719 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:29:07.764117 ip-10-0-139-12 kubenswrapper[2573]: W0416 
16:29:07.756722 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:29:07.764117 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756725 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:29:07.764117 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756727 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:29:07.764117 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756730 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:29:07.764117 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756733 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:29:07.764117 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756735 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:29:07.764117 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756739 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:29:07.764117 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756743 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:29:07.764117 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756745 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:29:07.764117 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756749 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:29:07.764117 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756752 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:29:07.764117 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756754 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:29:07.764591 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.756757 2573 feature_gate.go:328] unrecognized 
feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:29:07.764591 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.757600 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:29:07.764591 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.764064 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 16:29:07.764591 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.764079 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 16:29:07.764591 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764127 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:29:07.764591 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764140 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:29:07.764591 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764145 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:29:07.764591 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764148 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:29:07.764591 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764151 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:29:07.764591 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764154 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:29:07.764591 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764158 2573 
feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:29:07.764591 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764161 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:29:07.764591 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764164 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:29:07.764591 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764167 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:29:07.764591 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764169 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:29:07.764984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764172 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:29:07.764984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764175 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:29:07.764984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764178 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:29:07.764984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764180 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:29:07.764984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764183 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:29:07.764984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764186 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:29:07.764984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764188 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:29:07.764984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764191 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 
16:29:07.764984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764194 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:29:07.764984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764197 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:29:07.764984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764200 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:29:07.764984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764202 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:29:07.764984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764205 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:29:07.764984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764208 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:29:07.764984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764212 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:29:07.764984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764214 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:29:07.764984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764217 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:29:07.764984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764219 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:29:07.764984 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764222 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:29:07.765458 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764225 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:29:07.765458 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764229 2573 
feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:29:07.765458 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764231 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:29:07.765458 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764234 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:29:07.765458 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764237 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:29:07.765458 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764240 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:29:07.765458 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764242 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:29:07.765458 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764245 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:29:07.765458 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764247 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:29:07.765458 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764250 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:29:07.765458 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764253 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:29:07.765458 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764256 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:29:07.765458 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764258 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:29:07.765458 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764261 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:29:07.765458 ip-10-0-139-12 kubenswrapper[2573]: W0416 
16:29:07.764264 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:29:07.765458 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764266 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:29:07.765458 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764269 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:29:07.765458 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764271 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:29:07.765458 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764274 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:29:07.765458 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764276 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:29:07.765994 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764279 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:29:07.765994 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764281 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:29:07.765994 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764284 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:29:07.765994 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764287 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:29:07.765994 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764289 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:29:07.765994 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764293 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 16:29:07.765994 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764296 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:29:07.765994 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764299 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:29:07.765994 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764302 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:29:07.765994 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764305 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:29:07.765994 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764308 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:29:07.765994 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764311 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:29:07.765994 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764313 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:29:07.765994 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764317 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:29:07.765994 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764320 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:29:07.765994 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764322 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:29:07.765994 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764325 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:29:07.765994 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764328 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:29:07.765994 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764330 2573 feature_gate.go:328] unrecognized feature gate: 
CPMSMachineNamePrefix Apr 16 16:29:07.765994 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764333 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:29:07.766476 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764335 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:29:07.766476 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764338 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:29:07.766476 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764340 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:29:07.766476 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764343 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:29:07.766476 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764346 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:29:07.766476 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764348 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:29:07.766476 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764351 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:29:07.766476 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764353 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:29:07.766476 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764356 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:29:07.766476 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764359 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:29:07.766476 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764361 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:29:07.766476 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764364 2573 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:29:07.766476 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764366 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:29:07.766476 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764369 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:29:07.766476 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764373 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:29:07.766476 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764376 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:29:07.766887 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.764382 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:29:07.766887 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764484 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:29:07.766887 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764488 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:29:07.766887 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764491 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:29:07.766887 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764494 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:29:07.766887 ip-10-0-139-12 
kubenswrapper[2573]: W0416 16:29:07.764498 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:29:07.766887 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764500 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:29:07.766887 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764503 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:29:07.766887 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764506 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:29:07.766887 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764509 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:29:07.766887 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764512 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:29:07.766887 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764515 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:29:07.766887 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764517 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:29:07.766887 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764520 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:29:07.766887 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764523 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:29:07.766887 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764525 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:29:07.767275 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764528 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:29:07.767275 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764530 2573 feature_gate.go:328] unrecognized feature gate: 
AdditionalRoutingCapabilities Apr 16 16:29:07.767275 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764533 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:29:07.767275 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764535 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:29:07.767275 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764538 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:29:07.767275 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764541 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:29:07.767275 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764543 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:29:07.767275 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764546 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:29:07.767275 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764548 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:29:07.767275 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764551 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:29:07.767275 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764553 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:29:07.767275 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764556 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:29:07.767275 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764560 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 16:29:07.767275 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764563 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:29:07.767275 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764566 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:29:07.767275 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764569 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:29:07.767275 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764572 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:29:07.767275 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764575 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:29:07.767275 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764578 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:29:07.767749 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764581 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:29:07.767749 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764584 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:29:07.767749 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764587 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:29:07.767749 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764590 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:29:07.767749 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764593 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:29:07.767749 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764596 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:29:07.767749 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764599 2573 feature_gate.go:328] 
unrecognized feature gate: MachineAPIMigration Apr 16 16:29:07.767749 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764601 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:29:07.767749 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764604 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:29:07.767749 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764607 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:29:07.767749 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764609 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:29:07.767749 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764612 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:29:07.767749 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764615 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:29:07.767749 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764617 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:29:07.767749 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764620 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:29:07.767749 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764622 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:29:07.767749 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764625 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:29:07.767749 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764627 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:29:07.767749 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764630 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:29:07.768215 
ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764633 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:29:07.768215 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764635 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:29:07.768215 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764638 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:29:07.768215 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764640 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:29:07.768215 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764643 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:29:07.768215 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764645 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:29:07.768215 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764664 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:29:07.768215 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764668 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:29:07.768215 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764672 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:29:07.768215 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764674 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:29:07.768215 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764677 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:29:07.768215 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764680 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:29:07.768215 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764683 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:29:07.768215 
ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764686 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:29:07.768215 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764688 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:29:07.768215 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764692 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 16:29:07.768215 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764696 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:29:07.768215 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764699 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:29:07.768215 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764702 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:29:07.768215 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764705 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:29:07.768730 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764707 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:29:07.768730 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764710 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:29:07.768730 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764712 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:29:07.768730 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764715 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:29:07.768730 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764717 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:29:07.768730 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764720 2573 feature_gate.go:328] 
unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:29:07.768730 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764722 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:29:07.768730 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764725 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:29:07.768730 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764727 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:29:07.768730 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764730 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:29:07.768730 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764732 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:29:07.768730 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764735 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:29:07.768730 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:07.764738 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:29:07.768730 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.764743 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:29:07.768730 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.765489 2573 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 16:29:07.771164 ip-10-0-139-12 kubenswrapper[2573]: I0416 
16:29:07.771149 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 16:29:07.772030 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.772019 2573 server.go:1019] "Starting client certificate rotation" Apr 16 16:29:07.772138 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.772113 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 16:29:07.772181 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.772156 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 16:29:07.799841 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.799819 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 16:29:07.804241 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.804221 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 16:29:07.822883 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.822784 2573 log.go:25] "Validated CRI v1 runtime API" Apr 16 16:29:07.830062 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.830045 2573 log.go:25] "Validated CRI v1 image API" Apr 16 16:29:07.830366 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.830349 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 16:29:07.831312 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.831296 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 16:29:07.834173 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.834149 2573 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 c5b4bd44-6125-4ce3-bffd-64967db81271:/dev/nvme0n1p4 
ccfcd7b2-aac5-4f87-bac1-e74078443eef:/dev/nvme0n1p3] Apr 16 16:29:07.834246 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.834177 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 16:29:07.839236 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.839123 2573 manager.go:217] Machine: {Timestamp:2026-04-16 16:29:07.837988574 +0000 UTC m=+0.422057314 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3090579 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23f244c51cb66fbdaaccc17829cd58 SystemUUID:ec23f244-c51c-b66f-bdaa-ccc17829cd58 BootID:1644168e-f279-4f4a-b1d5-101202484bfa Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:02:ae:5d:0b:69:a5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ae:5d:0b:69:a5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:02:af:f1:c3:e5:95 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 16:29:07.839236 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.839225 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 16:29:07.839393 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.839336 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 16:29:07.841048 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.841025 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 16:29:07.841239 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.841052 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-12.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 16:29:07.841323 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.841252 2573 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 16:29:07.841323 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.841265 2573 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 16:29:07.841323 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.841284 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:29:07.844302 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.844289 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:29:07.845751 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.845738 2573 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:29:07.845878 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.845867 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 16:29:07.848017 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.848005 2573 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 16:29:07.848075 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.848024 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 16:29:07.848075 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.848040 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 16:29:07.848075 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.848053 2573 kubelet.go:397] "Adding apiserver pod source"
Apr 16 16:29:07.848075 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.848066 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 16:29:07.849211 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.849198 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 16:29:07.849282 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.849222 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 16:29:07.852510 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.852493 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 16:29:07.854365 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.854352 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 16:29:07.856061 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.856047 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 16:29:07.856122 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.856075 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 16:29:07.856122 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.856083 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 16:29:07.856122 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.856091 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 16:29:07.856122 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.856096 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 16:29:07.856122 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.856104 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 16:29:07.856122 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.856111 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 16:29:07.856389 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.856145 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 16:29:07.856389 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.856158 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 16:29:07.856389 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.856169 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 16:29:07.856389 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.856194 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 16:29:07.856550 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.856504 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 16:29:07.857478 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.857469 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 16:29:07.857478 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.857478 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 16:29:07.861051 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.861037 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 16:29:07.861132 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.861071 2573 server.go:1295] "Started kubelet"
Apr 16 16:29:07.861216 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.861153 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 16:29:07.861288 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.861236 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 16:29:07.861370 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.861357 2573 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 16:29:07.862084 ip-10-0-139-12 systemd[1]: Started Kubernetes Kubelet.
Apr 16 16:29:07.862457 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.862066 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-12.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 16:29:07.862457 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:07.862133 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 16:29:07.862457 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:07.862196 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-12.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 16:29:07.863048 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.863024 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 16:29:07.863048 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.863046 2573 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 16:29:07.866577 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:07.865708 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-12.ec2.internal.18a6e33b878e101f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-12.ec2.internal,UID:ip-10-0-139-12.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-139-12.ec2.internal,},FirstTimestamp:2026-04-16 16:29:07.861049375 +0000 UTC m=+0.445118115,LastTimestamp:2026-04-16 16:29:07.861049375 +0000 UTC m=+0.445118115,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-12.ec2.internal,}"
Apr 16 16:29:07.869286 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.869269 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 16:29:07.869761 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.869744 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9jb79"
Apr 16 16:29:07.869939 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.869927 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 16:29:07.872478 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.872460 2573 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 16:29:07.872581 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.872571 2573 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 16:29:07.872676 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:07.872633 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-12.ec2.internal\" not found"
Apr 16 16:29:07.872825 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.872812 2573 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 16:29:07.872879 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.872837 2573 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 16:29:07.873469 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.873449 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 16:29:07.874144 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.874121 2573 factory.go:55] Registering systemd factory
Apr 16 16:29:07.874144 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.874141 2573 factory.go:223] Registration of the systemd container factory successfully
Apr 16 16:29:07.874482 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.874462 2573 factory.go:153] Registering CRI-O factory
Apr 16 16:29:07.874578 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.874487 2573 factory.go:223] Registration of the crio container factory successfully
Apr 16 16:29:07.874637 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.874587 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 16:29:07.874637 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.874634 2573 factory.go:103] Registering Raw factory
Apr 16 16:29:07.874755 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.874665 2573 manager.go:1196] Started watching for new ooms in manager
Apr 16 16:29:07.875570 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.875550 2573 manager.go:319] Starting recovery of all containers
Apr 16 16:29:07.876411 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:07.876338 2573 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 16:29:07.876992 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.876972 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9jb79"
Apr 16 16:29:07.882480 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.882461 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:29:07.884849 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:07.884820 2573 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-12.ec2.internal\" not found" node="ip-10-0-139-12.ec2.internal"
Apr 16 16:29:07.887780 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.887529 2573 manager.go:324] Recovery completed
Apr 16 16:29:07.891696 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.891683 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:29:07.894074 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.894055 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-12.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:29:07.894139 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.894089 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-12.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:29:07.894139 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.894103 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-12.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:29:07.894641 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.894627 2573 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 16:29:07.894641 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.894637 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 16:29:07.894743 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.894675 2573 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:29:07.897412 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.897401 2573 policy_none.go:49] "None policy: Start"
Apr 16 16:29:07.897453 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.897415 2573 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 16:29:07.897453 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.897425 2573 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 16:29:07.938212 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.938196 2573 manager.go:341] "Starting Device Plugin manager"
Apr 16 16:29:07.943397 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:07.938232 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 16:29:07.943397 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.938246 2573 server.go:85] "Starting device plugin registration server"
Apr 16 16:29:07.943397 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.938471 2573 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 16:29:07.943397 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.938481 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 16:29:07.943397 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.938588 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 16:29:07.943397 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.938679 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 16:29:07.943397 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.938687 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 16:29:07.943397 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:07.939320 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 16:29:07.943397 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:07.939353 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-12.ec2.internal\" not found"
Apr 16 16:29:07.970792 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.970762 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 16:29:07.971999 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.971980 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 16:29:07.972108 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.972007 2573 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 16:29:07.972108 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.972025 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 16:29:07.972108 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.972032 2573 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 16:29:07.972108 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:07.972066 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 16:29:07.975356 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:07.975341 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:29:08.038960 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.038874 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:29:08.039867 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.039851 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-12.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:29:08.039964 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.039905 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-12.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:29:08.039964 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.039920 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-12.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:29:08.039964 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.039958 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-12.ec2.internal"
Apr 16 16:29:08.048841 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.048821 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-12.ec2.internal"
Apr 16 16:29:08.048944 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:08.048846 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-12.ec2.internal\": node \"ip-10-0-139-12.ec2.internal\" not found"
Apr 16 16:29:08.060704 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:08.060688 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-12.ec2.internal\" not found"
Apr 16 16:29:08.072480 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.072433 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-12.ec2.internal"]
Apr 16 16:29:08.072571 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.072518 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:29:08.073361 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.073347 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-12.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:29:08.073427 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.073375 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-12.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:29:08.073427 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.073386 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-12.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:29:08.074674 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.074662 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:29:08.074826 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.074813 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal"
Apr 16 16:29:08.074863 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.074842 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:29:08.075349 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.075333 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-12.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:29:08.075396 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.075361 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-12.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:29:08.075396 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.075371 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-12.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:29:08.075457 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.075340 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-12.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:29:08.075457 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.075435 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-12.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:29:08.075457 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.075445 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-12.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:29:08.077109 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.077094 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-12.ec2.internal"
Apr 16 16:29:08.077165 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.077128 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:29:08.077779 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.077764 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-12.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:29:08.077837 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.077794 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-12.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:29:08.077837 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.077807 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-12.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:29:08.106301 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:08.106275 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-12.ec2.internal\" not found" node="ip-10-0-139-12.ec2.internal"
Apr 16 16:29:08.110456 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:08.110440 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-12.ec2.internal\" not found" node="ip-10-0-139-12.ec2.internal"
Apr 16 16:29:08.160814 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:08.160787 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-12.ec2.internal\" not found"
Apr 16 16:29:08.174205 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.174184 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7fb776f437685f75160db2a241f9a93e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal\" (UID: \"7fb776f437685f75160db2a241f9a93e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal"
Apr 16 16:29:08.174281 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.174210 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7fb776f437685f75160db2a241f9a93e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal\" (UID: \"7fb776f437685f75160db2a241f9a93e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal"
Apr 16 16:29:08.261346 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:08.261307 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-12.ec2.internal\" not found"
Apr 16 16:29:08.274690 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.274665 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7fb776f437685f75160db2a241f9a93e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal\" (UID: \"7fb776f437685f75160db2a241f9a93e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal"
Apr 16 16:29:08.274737 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.274697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7fb776f437685f75160db2a241f9a93e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal\" (UID: \"7fb776f437685f75160db2a241f9a93e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal"
Apr 16 16:29:08.274737 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.274719 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f2b918a444bf444231f9bc2675345c3c-config\") pod \"kube-apiserver-proxy-ip-10-0-139-12.ec2.internal\" (UID: \"f2b918a444bf444231f9bc2675345c3c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-12.ec2.internal"
Apr 16 16:29:08.274800 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.274790 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7fb776f437685f75160db2a241f9a93e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal\" (UID: \"7fb776f437685f75160db2a241f9a93e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal"
Apr 16 16:29:08.274855 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.274836 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7fb776f437685f75160db2a241f9a93e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal\" (UID: \"7fb776f437685f75160db2a241f9a93e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal"
Apr 16 16:29:08.362097 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:08.362031 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-12.ec2.internal\" not found"
Apr 16 16:29:08.375438 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.375410 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f2b918a444bf444231f9bc2675345c3c-config\") pod \"kube-apiserver-proxy-ip-10-0-139-12.ec2.internal\" (UID: \"f2b918a444bf444231f9bc2675345c3c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-12.ec2.internal"
Apr 16 16:29:08.375492 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.375466 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f2b918a444bf444231f9bc2675345c3c-config\") pod \"kube-apiserver-proxy-ip-10-0-139-12.ec2.internal\" (UID: \"f2b918a444bf444231f9bc2675345c3c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-12.ec2.internal"
Apr 16 16:29:08.410581 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.410555 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal"
Apr 16 16:29:08.413164 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.413148 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-12.ec2.internal"
Apr 16 16:29:08.463084 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:08.463055 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-12.ec2.internal\" not found"
Apr 16 16:29:08.563690 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:08.563661 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-12.ec2.internal\" not found"
Apr 16 16:29:08.664307 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:08.664247 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-12.ec2.internal\" not found"
Apr 16 16:29:08.764939 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:08.764917 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-12.ec2.internal\" not found"
Apr 16 16:29:08.772184 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.772166 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 16:29:08.772319 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.772302 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:29:08.772357 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.772329 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:29:08.865021 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:08.864997 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-12.ec2.internal\" not found"
Apr 16 16:29:08.870072 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.870055 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 16:29:08.879644 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.879610 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 16:24:07 +0000 UTC" deadline="2027-11-14 16:40:14.336311597 +0000 UTC"
Apr 16 16:29:08.879763 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.879646 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13848h11m5.456670082s"
Apr 16 16:29:08.883853 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.883836 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:29:08.901604 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.901568 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-5fqhg"
Apr 16 16:29:08.909508 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.909488 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-5fqhg"
Apr 16 16:29:08.923179 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.923129 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:29:08.971235 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.971211 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal"
Apr 16 16:29:08.983735 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.983710 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:29:08.985418 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.985402 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-12.ec2.internal"
Apr 16 16:29:08.992942 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:08.992925 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:29:09.001934 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:09.001901 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fb776f437685f75160db2a241f9a93e.slice/crio-40f66cba4e4487b2509d890f681be881e1b9c73ba41d68673358def017c870b2 WatchSource:0}: Error finding container 40f66cba4e4487b2509d890f681be881e1b9c73ba41d68673358def017c870b2: Status 404 returned error can't find the container with id 40f66cba4e4487b2509d890f681be881e1b9c73ba41d68673358def017c870b2
Apr 16 16:29:09.002713 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:09.002690 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2b918a444bf444231f9bc2675345c3c.slice/crio-5c74141702ef31cfd4cf8f0009b3251e3286320a0741864a8cc763819d657c8e WatchSource:0}: Error finding container 5c74141702ef31cfd4cf8f0009b3251e3286320a0741864a8cc763819d657c8e: Status 404 returned error can't find the container with id 5c74141702ef31cfd4cf8f0009b3251e3286320a0741864a8cc763819d657c8e
Apr 16 16:29:09.006242 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.006228 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:29:09.337403 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.337325 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:29:09.792680 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.792596 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:29:09.821437 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.821409 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:29:09.849522 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.849498 2573 apiserver.go:52] "Watching apiserver"
Apr 16 16:29:09.856474 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.856449 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 16:29:09.857845 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.857821 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-vrlk5","openshift-network-operator/iptables-alerter-26g6g","kube-system/konnectivity-agent-pgpjj","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k","openshift-cluster-node-tuning-operator/tuned-6t9jb","openshift-multus/multus-additional-cni-plugins-7vvmn","openshift-network-diagnostics/network-check-target-xzlm9","openshift-ovn-kubernetes/ovnkube-node-sg9b4","kube-system/kube-apiserver-proxy-ip-10-0-139-12.ec2.internal","openshift-image-registry/node-ca-pkd9h","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal","openshift-multus/multus-7xh5k"]
Apr 16 16:29:09.860496 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.860473 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrlk5"
Apr 16 16:29:09.860606 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:09.860581 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrlk5" podUID="bb3c2920-54bb-42cd-a7e2-0c22345e99b4"
Apr 16 16:29:09.864826 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.864802 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pgpjj"
Apr 16 16:29:09.867126 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.867097 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k" Apr 16 16:29:09.867640 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.867622 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-cqrjn\"" Apr 16 16:29:09.867754 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.867632 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 16:29:09.867754 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.867746 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 16:29:09.869519 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.869501 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-26g6g" Apr 16 16:29:09.870103 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.870082 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 16:29:09.871502 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.870555 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 16:29:09.871502 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.870738 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 16:29:09.871502 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.871082 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-82746\"" Apr 16 16:29:09.871502 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.871143 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" Apr 16 16:29:09.873128 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.872945 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:29:09.873862 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.873613 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-4wxsw\"" Apr 16 16:29:09.873862 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.873615 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 16:29:09.873862 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.873718 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 16:29:09.873862 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.873826 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gl475\"" Apr 16 16:29:09.874599 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.873996 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:29:09.874599 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.874108 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 16:29:09.877198 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.877178 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7vvmn" Apr 16 16:29:09.877300 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.877280 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:09.877390 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:09.877345 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzlm9" podUID="6e5b8c51-4e68-4bf7-bb21-ec500c600441" Apr 16 16:29:09.879784 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.879765 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 16:29:09.880009 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.879992 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 16:29:09.881077 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.880737 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 16:29:09.881077 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.880781 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 16:29:09.881077 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.880798 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 16:29:09.881077 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.880737 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-qrl64\"" Apr 16 16:29:09.882241 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.882142 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:09.883275 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883252 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34f77f7d-b36d-4041-8328-4e6971055b19-system-cni-dir\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn" Apr 16 16:29:09.883362 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883290 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ll2j\" (UniqueName: \"kubernetes.io/projected/34f77f7d-b36d-4041-8328-4e6971055b19-kube-api-access-6ll2j\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn" Apr 16 16:29:09.883362 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883319 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f931d8c4-2e7a-44a0-8e0b-ee392349e62d-konnectivity-ca\") pod \"konnectivity-agent-pgpjj\" (UID: \"f931d8c4-2e7a-44a0-8e0b-ee392349e62d\") " pod="kube-system/konnectivity-agent-pgpjj" Apr 16 16:29:09.883481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883373 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2b15cd02-f09c-4e86-a518-ab8f11540068-registration-dir\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: 
\"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k" Apr 16 16:29:09.883481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883410 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hdkt\" (UniqueName: \"kubernetes.io/projected/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-kube-api-access-8hdkt\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" Apr 16 16:29:09.883481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883442 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/34f77f7d-b36d-4041-8328-4e6971055b19-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn" Apr 16 16:29:09.883481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883469 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-host\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" Apr 16 16:29:09.883703 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883504 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2b15cd02-f09c-4e86-a518-ab8f11540068-socket-dir\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: \"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k" Apr 16 16:29:09.883703 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883533 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-var-lib-kubelet\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" Apr 16 16:29:09.883703 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883557 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db546d9e-9684-452f-876d-0d945addae4c-host-slash\") pod \"iptables-alerter-26g6g\" (UID: \"db546d9e-9684-452f-876d-0d945addae4c\") " pod="openshift-network-operator/iptables-alerter-26g6g" Apr 16 16:29:09.883703 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883580 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-modprobe-d\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" Apr 16 16:29:09.883703 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883617 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/34f77f7d-b36d-4041-8328-4e6971055b19-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn" Apr 16 16:29:09.883703 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883668 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/34f77f7d-b36d-4041-8328-4e6971055b19-cnibin\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " 
pod="openshift-multus/multus-additional-cni-plugins-7vvmn" Apr 16 16:29:09.884006 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883706 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/34f77f7d-b36d-4041-8328-4e6971055b19-cni-binary-copy\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn" Apr 16 16:29:09.884006 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883738 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2b15cd02-f09c-4e86-a518-ab8f11540068-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: \"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k" Apr 16 16:29:09.884006 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883774 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-sysctl-d\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" Apr 16 16:29:09.884006 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883809 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-sysctl-conf\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" Apr 16 16:29:09.884006 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883832 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-run\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" Apr 16 16:29:09.884006 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883864 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/34f77f7d-b36d-4041-8328-4e6971055b19-os-release\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn" Apr 16 16:29:09.884006 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883894 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/db546d9e-9684-452f-876d-0d945addae4c-iptables-alerter-script\") pod \"iptables-alerter-26g6g\" (UID: \"db546d9e-9684-452f-876d-0d945addae4c\") " pod="openshift-network-operator/iptables-alerter-26g6g" Apr 16 16:29:09.884006 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883916 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2b15cd02-f09c-4e86-a518-ab8f11540068-device-dir\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: \"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k" Apr 16 16:29:09.884006 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883940 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn7qg\" (UniqueName: \"kubernetes.io/projected/2b15cd02-f09c-4e86-a518-ab8f11540068-kube-api-access-dn7qg\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: \"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k" Apr 16 
16:29:09.884006 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.883981 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-kubernetes\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" Apr 16 16:29:09.884481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884010 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-lib-modules\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" Apr 16 16:29:09.884481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884037 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-tuned\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" Apr 16 16:29:09.884481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884061 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmx7x\" (UniqueName: \"kubernetes.io/projected/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-kube-api-access-pmx7x\") pod \"network-metrics-daemon-vrlk5\" (UID: \"bb3c2920-54bb-42cd-a7e2-0c22345e99b4\") " pod="openshift-multus/network-metrics-daemon-vrlk5" Apr 16 16:29:09.884481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884091 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f931d8c4-2e7a-44a0-8e0b-ee392349e62d-agent-certs\") pod 
\"konnectivity-agent-pgpjj\" (UID: \"f931d8c4-2e7a-44a0-8e0b-ee392349e62d\") " pod="kube-system/konnectivity-agent-pgpjj" Apr 16 16:29:09.884481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884138 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2b15cd02-f09c-4e86-a518-ab8f11540068-etc-selinux\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: \"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k" Apr 16 16:29:09.884481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884114 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 16:29:09.884481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884179 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 16:29:09.884481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884178 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-sysconfig\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" Apr 16 16:29:09.884481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884242 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-sys\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" Apr 16 16:29:09.884481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884266 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-tmp\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" Apr 16 16:29:09.884481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884289 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/34f77f7d-b36d-4041-8328-4e6971055b19-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn" Apr 16 16:29:09.884481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884381 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs\") pod \"network-metrics-daemon-vrlk5\" (UID: \"bb3c2920-54bb-42cd-a7e2-0c22345e99b4\") " pod="openshift-multus/network-metrics-daemon-vrlk5" Apr 16 16:29:09.884481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884407 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4h5c\" (UniqueName: \"kubernetes.io/projected/db546d9e-9684-452f-876d-0d945addae4c-kube-api-access-p4h5c\") pod \"iptables-alerter-26g6g\" (UID: \"db546d9e-9684-452f-876d-0d945addae4c\") " pod="openshift-network-operator/iptables-alerter-26g6g" Apr 16 16:29:09.884481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884437 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2b15cd02-f09c-4e86-a518-ab8f11540068-sys-fs\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: \"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k" Apr 16 16:29:09.884481 
ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884464 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pkd9h" Apr 16 16:29:09.884481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884474 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7xh5k" Apr 16 16:29:09.884481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884463 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-systemd\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" Apr 16 16:29:09.885294 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884810 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 16:29:09.885294 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884857 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 16:29:09.885294 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884943 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-w97np\"" Apr 16 16:29:09.885294 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.884987 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 16:29:09.885294 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.885050 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 16:29:09.886486 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.886437 2573 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-rvtq8\"" Apr 16 16:29:09.887152 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.886726 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5p66s\"" Apr 16 16:29:09.887152 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.886768 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 16:29:09.887152 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.886995 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 16:29:09.887152 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.887056 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 16:29:09.887361 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.887318 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 16:29:09.910336 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.910308 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:24:08 +0000 UTC" deadline="2027-10-19 20:42:28.116500212 +0000 UTC" Apr 16 16:29:09.910419 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.910336 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13228h13m18.206168253s" Apr 16 16:29:09.974516 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.974481 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 16:29:09.976230 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.976188 2573 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal" event={"ID":"7fb776f437685f75160db2a241f9a93e","Type":"ContainerStarted","Data":"40f66cba4e4487b2509d890f681be881e1b9c73ba41d68673358def017c870b2"} Apr 16 16:29:09.977163 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.977139 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-12.ec2.internal" event={"ID":"f2b918a444bf444231f9bc2675345c3c","Type":"ContainerStarted","Data":"5c74141702ef31cfd4cf8f0009b3251e3286320a0741864a8cc763819d657c8e"} Apr 16 16:29:09.985517 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.985489 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-var-lib-openvswitch\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:09.985609 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.985529 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0049088-3d94-496a-ba74-286b0c64fa8a-ovnkube-config\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:09.985609 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.985554 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-host\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" Apr 16 16:29:09.985609 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.985581 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2b15cd02-f09c-4e86-a518-ab8f11540068-socket-dir\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: \"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k"
Apr 16 16:29:09.985609 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.985605 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-var-lib-kubelet\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.985829 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.985631 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-cnibin\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:09.985829 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.985674 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db546d9e-9684-452f-876d-0d945addae4c-host-slash\") pod \"iptables-alerter-26g6g\" (UID: \"db546d9e-9684-452f-876d-0d945addae4c\") " pod="openshift-network-operator/iptables-alerter-26g6g"
Apr 16 16:29:09.985829 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.985681 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-host\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.985829 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.985704 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-546v4\" (UniqueName: \"kubernetes.io/projected/dc8e5144-a514-40ce-8e9e-c4b13ea6d7af-kube-api-access-546v4\") pod \"node-ca-pkd9h\" (UID: \"dc8e5144-a514-40ce-8e9e-c4b13ea6d7af\") " pod="openshift-image-registry/node-ca-pkd9h"
Apr 16 16:29:09.985829 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.985719 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-multus-cni-dir\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:09.985829 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.985740 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-var-lib-kubelet\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.985829 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.985755 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-host-run-netns\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:09.985829 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.985763 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2b15cd02-f09c-4e86-a518-ab8f11540068-socket-dir\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: \"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k"
Apr 16 16:29:09.985829 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.985783 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db546d9e-9684-452f-876d-0d945addae4c-host-slash\") pod \"iptables-alerter-26g6g\" (UID: \"db546d9e-9684-452f-876d-0d945addae4c\") " pod="openshift-network-operator/iptables-alerter-26g6g"
Apr 16 16:29:09.985829 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.985820 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-etc-openvswitch\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:09.986275 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.985847 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/34f77f7d-b36d-4041-8328-4e6971055b19-cni-binary-copy\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn"
Apr 16 16:29:09.986275 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.985880 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6839f3b4-61f8-4813-988a-8f23df71a050-cni-binary-copy\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:09.986275 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.985931 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvl8f\" (UniqueName: \"kubernetes.io/projected/f0049088-3d94-496a-ba74-286b0c64fa8a-kube-api-access-kvl8f\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:09.986275 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.985977 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2b15cd02-f09c-4e86-a518-ab8f11540068-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: \"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k"
Apr 16 16:29:09.986275 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986030 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2b15cd02-f09c-4e86-a518-ab8f11540068-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: \"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k"
Apr 16 16:29:09.986275 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986051 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-sysctl-d\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.986275 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986091 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/34f77f7d-b36d-4041-8328-4e6971055b19-os-release\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn"
Apr 16 16:29:09.986275 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986111 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-hostroot\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:09.986275 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986138 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/db546d9e-9684-452f-876d-0d945addae4c-iptables-alerter-script\") pod \"iptables-alerter-26g6g\" (UID: \"db546d9e-9684-452f-876d-0d945addae4c\") " pod="openshift-network-operator/iptables-alerter-26g6g"
Apr 16 16:29:09.986275 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986162 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/34f77f7d-b36d-4041-8328-4e6971055b19-os-release\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn"
Apr 16 16:29:09.986275 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986162 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dn7qg\" (UniqueName: \"kubernetes.io/projected/2b15cd02-f09c-4e86-a518-ab8f11540068-kube-api-access-dn7qg\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: \"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k"
Apr 16 16:29:09.986275 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986198 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-kubernetes\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.986275 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986216 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-sysctl-d\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.986275 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986227 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-lib-modules\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.986275 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986265 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-systemd-units\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:09.986932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986295 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-run-ovn-kubernetes\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:09.986932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986318 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0049088-3d94-496a-ba74-286b0c64fa8a-env-overrides\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:09.986932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986331 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-lib-modules\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.986932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986309 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-kubernetes\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.986932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986353 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmx7x\" (UniqueName: \"kubernetes.io/projected/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-kube-api-access-pmx7x\") pod \"network-metrics-daemon-vrlk5\" (UID: \"bb3c2920-54bb-42cd-a7e2-0c22345e99b4\") " pod="openshift-multus/network-metrics-daemon-vrlk5"
Apr 16 16:29:09.986932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986388 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f931d8c4-2e7a-44a0-8e0b-ee392349e62d-agent-certs\") pod \"konnectivity-agent-pgpjj\" (UID: \"f931d8c4-2e7a-44a0-8e0b-ee392349e62d\") " pod="kube-system/konnectivity-agent-pgpjj"
Apr 16 16:29:09.986932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986413 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-sysconfig\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.986932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986434 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-sys\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.986932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986464 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/34f77f7d-b36d-4041-8328-4e6971055b19-cni-binary-copy\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn"
Apr 16 16:29:09.986932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986478 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-sys\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.986932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986476 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-sysconfig\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.986932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-tmp\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.986932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986533 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-kubelet\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:09.986932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986556 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-node-log\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:09.986932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986579 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:09.986932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986605 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-systemd\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.986932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986631 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ll2j\" (UniqueName: \"kubernetes.io/projected/34f77f7d-b36d-4041-8328-4e6971055b19-kube-api-access-6ll2j\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn"
Apr 16 16:29:09.987704 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986672 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc8e5144-a514-40ce-8e9e-c4b13ea6d7af-host\") pod \"node-ca-pkd9h\" (UID: \"dc8e5144-a514-40ce-8e9e-c4b13ea6d7af\") " pod="openshift-image-registry/node-ca-pkd9h"
Apr 16 16:29:09.987704 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986688 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-systemd\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.987704 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986699 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-cni-netd\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:09.987704 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986725 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2b15cd02-f09c-4e86-a518-ab8f11540068-registration-dir\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: \"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k"
Apr 16 16:29:09.987704 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986763 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/34f77f7d-b36d-4041-8328-4e6971055b19-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn"
Apr 16 16:29:09.987704 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986828 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0049088-3d94-496a-ba74-286b0c64fa8a-ovnkube-script-lib\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:09.987704 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986843 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 16:29:09.987704 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986857 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-os-release\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:09.987704 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986846 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2b15cd02-f09c-4e86-a518-ab8f11540068-registration-dir\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: \"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k"
Apr 16 16:29:09.987704 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986881 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-multus-socket-dir-parent\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:09.987704 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986905 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-host-var-lib-cni-bin\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:09.987704 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986950 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-host-var-lib-cni-multus\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:09.987704 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.986990 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-host-run-multus-certs\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:09.987704 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987015 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-modprobe-d\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.987704 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987042 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/34f77f7d-b36d-4041-8328-4e6971055b19-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn"
Apr 16 16:29:09.987704 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987065 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-slash\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:09.987704 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987085 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/db546d9e-9684-452f-876d-0d945addae4c-iptables-alerter-script\") pod \"iptables-alerter-26g6g\" (UID: \"db546d9e-9684-452f-876d-0d945addae4c\") " pod="openshift-network-operator/iptables-alerter-26g6g"
Apr 16 16:29:09.988496 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987091 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/34f77f7d-b36d-4041-8328-4e6971055b19-cnibin\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn"
Apr 16 16:29:09.988496 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987153 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/34f77f7d-b36d-4041-8328-4e6971055b19-cnibin\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn"
Apr 16 16:29:09.988496 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987159 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-system-cni-dir\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:09.988496 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987181 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-modprobe-d\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.988496 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987209 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-host-var-lib-kubelet\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:09.988496 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987230 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/34f77f7d-b36d-4041-8328-4e6971055b19-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn"
Apr 16 16:29:09.988496 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987244 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6839f3b4-61f8-4813-988a-8f23df71a050-multus-daemon-config\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:09.988496 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987293 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df57k\" (UniqueName: \"kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k\") pod \"network-check-target-xzlm9\" (UID: \"6e5b8c51-4e68-4bf7-bb21-ec500c600441\") " pod="openshift-network-diagnostics/network-check-target-xzlm9"
Apr 16 16:29:09.988496 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987340 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-run-systemd\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:09.988496 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987383 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-log-socket\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:09.988496 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987411 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0049088-3d94-496a-ba74-286b0c64fa8a-ovn-node-metrics-cert\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:09.988496 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987451 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-sysctl-conf\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.988496 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987477 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-run\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.988496 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987512 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dc8e5144-a514-40ce-8e9e-c4b13ea6d7af-serviceca\") pod \"node-ca-pkd9h\" (UID: \"dc8e5144-a514-40ce-8e9e-c4b13ea6d7af\") " pod="openshift-image-registry/node-ca-pkd9h"
Apr 16 16:29:09.988496 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987536 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-host-run-k8s-cni-cncf-io\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:09.988496 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987582 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-run-openvswitch\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:09.988496 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987607 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2b15cd02-f09c-4e86-a518-ab8f11540068-device-dir\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: \"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k"
Apr 16 16:29:09.989160 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987642 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-tuned\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.989160 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987675 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2b15cd02-f09c-4e86-a518-ab8f11540068-device-dir\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: \"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k"
Apr 16 16:29:09.989160 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987684 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-sysctl-conf\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.989160 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987692 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-cni-bin\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:09.989160 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987709 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/34f77f7d-b36d-4041-8328-4e6971055b19-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn"
Apr 16 16:29:09.989160 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987720 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2b15cd02-f09c-4e86-a518-ab8f11540068-etc-selinux\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: \"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k"
Apr 16 16:29:09.989160 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987736 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-run\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:09.989160 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987759 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/34f77f7d-b36d-4041-8328-4e6971055b19-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn"
Apr 16 16:29:09.989160 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987788 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-run-netns\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:09.989160 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987800 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2b15cd02-f09c-4e86-a518-ab8f11540068-etc-selinux\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: \"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k"
Apr 16 16:29:09.989160 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987825 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs\") pod \"network-metrics-daemon-vrlk5\" (UID: \"bb3c2920-54bb-42cd-a7e2-0c22345e99b4\") " pod="openshift-multus/network-metrics-daemon-vrlk5"
Apr 16 16:29:09.989160 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987855 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4h5c\" (UniqueName: \"kubernetes.io/projected/db546d9e-9684-452f-876d-0d945addae4c-kube-api-access-p4h5c\") pod \"iptables-alerter-26g6g\" (UID: \"db546d9e-9684-452f-876d-0d945addae4c\") " pod="openshift-network-operator/iptables-alerter-26g6g"
Apr 16 16:29:09.989160 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987936 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2b15cd02-f09c-4e86-a518-ab8f11540068-sys-fs\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: \"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k"
Apr 16 16:29:09.989160 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987960 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34f77f7d-b36d-4041-8328-4e6971055b19-system-cni-dir\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn"
Apr 16 16:29:09.989160 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:09.987964 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:29:09.989160 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.987993 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-multus-conf-dir\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:09.989160 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.988020 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-etc-kubernetes\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:09.989826 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.988039 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-run-ovn\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:09.989826 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:09.988079 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs podName:bb3c2920-54bb-42cd-a7e2-0c22345e99b4 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:10.48804132 +0000 UTC m=+3.072110052 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs") pod "network-metrics-daemon-vrlk5" (UID: "bb3c2920-54bb-42cd-a7e2-0c22345e99b4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:29:09.989826 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.988083 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34f77f7d-b36d-4041-8328-4e6971055b19-system-cni-dir\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn" Apr 16 16:29:09.989826 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.988145 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2b15cd02-f09c-4e86-a518-ab8f11540068-sys-fs\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: \"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k" Apr 16 16:29:09.989826 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.988173 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f931d8c4-2e7a-44a0-8e0b-ee392349e62d-konnectivity-ca\") pod \"konnectivity-agent-pgpjj\" (UID: \"f931d8c4-2e7a-44a0-8e0b-ee392349e62d\") " pod="kube-system/konnectivity-agent-pgpjj" Apr 16 16:29:09.989826 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.988206 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hdkt\" (UniqueName: \"kubernetes.io/projected/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-kube-api-access-8hdkt\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" Apr 16 16:29:09.989826 ip-10-0-139-12 kubenswrapper[2573]: 
I0416 16:29:09.988223 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/34f77f7d-b36d-4041-8328-4e6971055b19-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn" Apr 16 16:29:09.989826 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.988246 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4fgn\" (UniqueName: \"kubernetes.io/projected/6839f3b4-61f8-4813-988a-8f23df71a050-kube-api-access-j4fgn\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:09.989826 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.988716 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f931d8c4-2e7a-44a0-8e0b-ee392349e62d-konnectivity-ca\") pod \"konnectivity-agent-pgpjj\" (UID: \"f931d8c4-2e7a-44a0-8e0b-ee392349e62d\") " pod="kube-system/konnectivity-agent-pgpjj" Apr 16 16:29:09.990247 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.990227 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-tmp\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" Apr 16 16:29:09.990247 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:09.990241 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-etc-tuned\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" Apr 16 16:29:09.990428 ip-10-0-139-12 kubenswrapper[2573]: 
I0416 16:29:09.990409 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f931d8c4-2e7a-44a0-8e0b-ee392349e62d-agent-certs\") pod \"konnectivity-agent-pgpjj\" (UID: \"f931d8c4-2e7a-44a0-8e0b-ee392349e62d\") " pod="kube-system/konnectivity-agent-pgpjj" Apr 16 16:29:10.000841 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.000812 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4h5c\" (UniqueName: \"kubernetes.io/projected/db546d9e-9684-452f-876d-0d945addae4c-kube-api-access-p4h5c\") pod \"iptables-alerter-26g6g\" (UID: \"db546d9e-9684-452f-876d-0d945addae4c\") " pod="openshift-network-operator/iptables-alerter-26g6g" Apr 16 16:29:10.000952 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.000913 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn7qg\" (UniqueName: \"kubernetes.io/projected/2b15cd02-f09c-4e86-a518-ab8f11540068-kube-api-access-dn7qg\") pod \"aws-ebs-csi-driver-node-ccn5k\" (UID: \"2b15cd02-f09c-4e86-a518-ab8f11540068\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k" Apr 16 16:29:10.001266 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.001241 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ll2j\" (UniqueName: \"kubernetes.io/projected/34f77f7d-b36d-4041-8328-4e6971055b19-kube-api-access-6ll2j\") pod \"multus-additional-cni-plugins-7vvmn\" (UID: \"34f77f7d-b36d-4041-8328-4e6971055b19\") " pod="openshift-multus/multus-additional-cni-plugins-7vvmn" Apr 16 16:29:10.001534 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.001511 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmx7x\" (UniqueName: \"kubernetes.io/projected/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-kube-api-access-pmx7x\") pod \"network-metrics-daemon-vrlk5\" (UID: \"bb3c2920-54bb-42cd-a7e2-0c22345e99b4\") " 
pod="openshift-multus/network-metrics-daemon-vrlk5" Apr 16 16:29:10.001701 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.001680 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hdkt\" (UniqueName: \"kubernetes.io/projected/8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e-kube-api-access-8hdkt\") pod \"tuned-6t9jb\" (UID: \"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e\") " pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" Apr 16 16:29:10.088871 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.088792 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-cnibin\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.088871 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.088843 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-546v4\" (UniqueName: \"kubernetes.io/projected/dc8e5144-a514-40ce-8e9e-c4b13ea6d7af-kube-api-access-546v4\") pod \"node-ca-pkd9h\" (UID: \"dc8e5144-a514-40ce-8e9e-c4b13ea6d7af\") " pod="openshift-image-registry/node-ca-pkd9h" Apr 16 16:29:10.089084 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.088927 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-cnibin\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.089084 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.088978 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-multus-cni-dir\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.089084 
ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089005 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-host-run-netns\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.089084 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089028 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-etc-openvswitch\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:10.089084 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089053 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6839f3b4-61f8-4813-988a-8f23df71a050-cni-binary-copy\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.089084 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089078 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvl8f\" (UniqueName: \"kubernetes.io/projected/f0049088-3d94-496a-ba74-286b0c64fa8a-kube-api-access-kvl8f\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:10.089321 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089090 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-host-run-netns\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.089321 ip-10-0-139-12 kubenswrapper[2573]: 
I0416 16:29:10.089106 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-hostroot\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.089321 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089127 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-etc-openvswitch\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:10.089321 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089129 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-multus-cni-dir\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.089321 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089162 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-hostroot\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.089321 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089284 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-systemd-units\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:10.089580 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089325 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-run-ovn-kubernetes\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:10.089580 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089353 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0049088-3d94-496a-ba74-286b0c64fa8a-env-overrides\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:10.089580 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089393 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-systemd-units\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:10.089580 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089393 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-kubelet\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:10.089580 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089418 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-run-ovn-kubernetes\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:10.089580 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089431 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-node-log\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:10.089580 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089430 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-kubelet\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:10.089580 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089478 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:10.089580 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089490 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-node-log\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:10.089580 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089509 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc8e5144-a514-40ce-8e9e-c4b13ea6d7af-host\") pod \"node-ca-pkd9h\" (UID: \"dc8e5144-a514-40ce-8e9e-c4b13ea6d7af\") " pod="openshift-image-registry/node-ca-pkd9h" Apr 16 16:29:10.089580 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089540 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-cni-netd\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:10.089580 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089552 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc8e5144-a514-40ce-8e9e-c4b13ea6d7af-host\") pod \"node-ca-pkd9h\" (UID: \"dc8e5144-a514-40ce-8e9e-c4b13ea6d7af\") " pod="openshift-image-registry/node-ca-pkd9h" Apr 16 16:29:10.089580 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089568 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0049088-3d94-496a-ba74-286b0c64fa8a-ovnkube-script-lib\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:10.090064 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089601 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-cni-netd\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:10.090064 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089538 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:10.090064 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089602 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-os-release\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.090064 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089644 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6839f3b4-61f8-4813-988a-8f23df71a050-cni-binary-copy\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.090064 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089700 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-multus-socket-dir-parent\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.090064 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089734 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-host-var-lib-cni-bin\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.090064 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089761 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-os-release\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.090064 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089762 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-host-var-lib-cni-multus\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.090064 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089804 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-host-var-lib-cni-multus\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.090064 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089814 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-multus-socket-dir-parent\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.090064 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089816 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-host-run-multus-certs\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.090064 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089851 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-host-var-lib-cni-bin\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.090064 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089850 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-host-run-multus-certs\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.090064 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089879 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-slash\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:10.090064 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089885 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0049088-3d94-496a-ba74-286b0c64fa8a-env-overrides\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:10.090064 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089905 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-system-cni-dir\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.090064 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089912 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-slash\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:10.090064 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089931 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-host-var-lib-kubelet\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.090818 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089956 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6839f3b4-61f8-4813-988a-8f23df71a050-multus-daemon-config\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.090818 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089965 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-system-cni-dir\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.090818 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089983 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-df57k\" (UniqueName: \"kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k\") pod \"network-check-target-xzlm9\" (UID: \"6e5b8c51-4e68-4bf7-bb21-ec500c600441\") " pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:10.090818 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.089998 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-host-var-lib-kubelet\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k" Apr 16 16:29:10.090818 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090009 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-run-systemd\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.090818 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090036 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-log-socket\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.090818 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090062 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-run-systemd\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.090818 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090062 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0049088-3d94-496a-ba74-286b0c64fa8a-ovn-node-metrics-cert\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.090818 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090092 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0049088-3d94-496a-ba74-286b0c64fa8a-ovnkube-script-lib\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.090818 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090103 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dc8e5144-a514-40ce-8e9e-c4b13ea6d7af-serviceca\") pod \"node-ca-pkd9h\" (UID: \"dc8e5144-a514-40ce-8e9e-c4b13ea6d7af\") " pod="openshift-image-registry/node-ca-pkd9h"
Apr 16 16:29:10.090818 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090111 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-log-socket\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.090818 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090137 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-host-run-k8s-cni-cncf-io\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:10.090818 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090209 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-run-openvswitch\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.090818 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090220 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-host-run-k8s-cni-cncf-io\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:10.090818 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090240 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-cni-bin\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.090818 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090268 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-run-netns\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.090818 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090273 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-run-openvswitch\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.090818 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090309 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-multus-conf-dir\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:10.091542 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090317 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-cni-bin\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.091542 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090339 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-host-run-netns\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.091542 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090353 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-etc-kubernetes\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:10.091542 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090377 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-run-ovn\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.091542 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090380 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-multus-conf-dir\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:10.091542 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090402 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4fgn\" (UniqueName: \"kubernetes.io/projected/6839f3b4-61f8-4813-988a-8f23df71a050-kube-api-access-j4fgn\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:10.091542 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090419 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6839f3b4-61f8-4813-988a-8f23df71a050-etc-kubernetes\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:10.091542 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090425 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-var-lib-openvswitch\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.091542 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090436 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-run-ovn\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.091542 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090451 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6839f3b4-61f8-4813-988a-8f23df71a050-multus-daemon-config\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:10.091542 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090459 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0049088-3d94-496a-ba74-286b0c64fa8a-ovnkube-config\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.091542 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090486 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0049088-3d94-496a-ba74-286b0c64fa8a-var-lib-openvswitch\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.091542 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090525 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dc8e5144-a514-40ce-8e9e-c4b13ea6d7af-serviceca\") pod \"node-ca-pkd9h\" (UID: \"dc8e5144-a514-40ce-8e9e-c4b13ea6d7af\") " pod="openshift-image-registry/node-ca-pkd9h"
Apr 16 16:29:10.091542 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.090925 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0049088-3d94-496a-ba74-286b0c64fa8a-ovnkube-config\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.092669 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.092636 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0049088-3d94-496a-ba74-286b0c64fa8a-ovn-node-metrics-cert\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.096063 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:10.096045 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:29:10.096183 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:10.096070 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:29:10.096183 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:10.096084 2573 projected.go:194] Error preparing data for projected volume kube-api-access-df57k for pod openshift-network-diagnostics/network-check-target-xzlm9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:29:10.096183 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:10.096160 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k podName:6e5b8c51-4e68-4bf7-bb21-ec500c600441 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:10.596144077 +0000 UTC m=+3.180212825 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-df57k" (UniqueName: "kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k") pod "network-check-target-xzlm9" (UID: "6e5b8c51-4e68-4bf7-bb21-ec500c600441") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:29:10.097567 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.097547 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvl8f\" (UniqueName: \"kubernetes.io/projected/f0049088-3d94-496a-ba74-286b0c64fa8a-kube-api-access-kvl8f\") pod \"ovnkube-node-sg9b4\" (UID: \"f0049088-3d94-496a-ba74-286b0c64fa8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.098105 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.098080 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-546v4\" (UniqueName: \"kubernetes.io/projected/dc8e5144-a514-40ce-8e9e-c4b13ea6d7af-kube-api-access-546v4\") pod \"node-ca-pkd9h\" (UID: \"dc8e5144-a514-40ce-8e9e-c4b13ea6d7af\") " pod="openshift-image-registry/node-ca-pkd9h"
Apr 16 16:29:10.104393 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.104362 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4fgn\" (UniqueName: \"kubernetes.io/projected/6839f3b4-61f8-4813-988a-8f23df71a050-kube-api-access-j4fgn\") pod \"multus-7xh5k\" (UID: \"6839f3b4-61f8-4813-988a-8f23df71a050\") " pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:10.177371 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.177334 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pgpjj"
Apr 16 16:29:10.186071 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.186046 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k"
Apr 16 16:29:10.196722 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.196701 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-26g6g"
Apr 16 16:29:10.201272 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.201250 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6t9jb"
Apr 16 16:29:10.207919 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.207899 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7vvmn"
Apr 16 16:29:10.213522 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.213503 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4"
Apr 16 16:29:10.220092 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.220074 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pkd9h"
Apr 16 16:29:10.224721 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.224702 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7xh5k"
Apr 16 16:29:10.495092 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.495012 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs\") pod \"network-metrics-daemon-vrlk5\" (UID: \"bb3c2920-54bb-42cd-a7e2-0c22345e99b4\") " pod="openshift-multus/network-metrics-daemon-vrlk5"
Apr 16 16:29:10.495242 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:10.495221 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:29:10.495309 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:10.495290 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs podName:bb3c2920-54bb-42cd-a7e2-0c22345e99b4 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:11.495272675 +0000 UTC m=+4.079341405 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs") pod "network-metrics-daemon-vrlk5" (UID: "bb3c2920-54bb-42cd-a7e2-0c22345e99b4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:29:10.659738 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:10.659713 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6839f3b4_61f8_4813_988a_8f23df71a050.slice/crio-8cb54bf0e37d80aa2a2d7fff5c6cc5c330680aa26be3ade03f97d292936e033d WatchSource:0}: Error finding container 8cb54bf0e37d80aa2a2d7fff5c6cc5c330680aa26be3ade03f97d292936e033d: Status 404 returned error can't find the container with id 8cb54bf0e37d80aa2a2d7fff5c6cc5c330680aa26be3ade03f97d292936e033d
Apr 16 16:29:10.661463 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:10.661448 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb546d9e_9684_452f_876d_0d945addae4c.slice/crio-a9a64332cd859b524242945cecd03bf62a546b1a36e753a4a1886fed27f71a40 WatchSource:0}: Error finding container a9a64332cd859b524242945cecd03bf62a546b1a36e753a4a1886fed27f71a40: Status 404 returned error can't find the container with id a9a64332cd859b524242945cecd03bf62a546b1a36e753a4a1886fed27f71a40
Apr 16 16:29:10.662396 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:10.662366 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ad1af0f_4c11_46e4_9fcc_8e22841d5e0e.slice/crio-d2c786e32400c394ad38b4f6fc95ace63a350f471eb449777c78d8b425dee740 WatchSource:0}: Error finding container d2c786e32400c394ad38b4f6fc95ace63a350f471eb449777c78d8b425dee740: Status 404 returned error can't find the container with id d2c786e32400c394ad38b4f6fc95ace63a350f471eb449777c78d8b425dee740
Apr 16 16:29:10.664184 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:10.664156 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b15cd02_f09c_4e86_a518_ab8f11540068.slice/crio-89e7d0fce4161a7eb7a57b2714e5b14ac87e99e4f7129bf1864ec6baf73d5ad4 WatchSource:0}: Error finding container 89e7d0fce4161a7eb7a57b2714e5b14ac87e99e4f7129bf1864ec6baf73d5ad4: Status 404 returned error can't find the container with id 89e7d0fce4161a7eb7a57b2714e5b14ac87e99e4f7129bf1864ec6baf73d5ad4
Apr 16 16:29:10.665499 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:10.665464 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0049088_3d94_496a_ba74_286b0c64fa8a.slice/crio-b16ee637de5496a022d69adb7aed059450ecd9b1f16971a197eee86518c24180 WatchSource:0}: Error finding container b16ee637de5496a022d69adb7aed059450ecd9b1f16971a197eee86518c24180: Status 404 returned error can't find the container with id b16ee637de5496a022d69adb7aed059450ecd9b1f16971a197eee86518c24180
Apr 16 16:29:10.688674 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:10.688637 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc8e5144_a514_40ce_8e9e_c4b13ea6d7af.slice/crio-827032a1c7c2c2e54e7b058b9a9eeece886bf322ff328f9c6daea4f17773cf69 WatchSource:0}: Error finding container 827032a1c7c2c2e54e7b058b9a9eeece886bf322ff328f9c6daea4f17773cf69: Status 404 returned error can't find the container with id 827032a1c7c2c2e54e7b058b9a9eeece886bf322ff328f9c6daea4f17773cf69
Apr 16 16:29:10.689332 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:10.689289 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf931d8c4_2e7a_44a0_8e0b_ee392349e62d.slice/crio-b257ae36b8cd79e4eaeb59842d3a42d5930b31eb6326f019dead8278cb84d810 WatchSource:0}: Error finding container b257ae36b8cd79e4eaeb59842d3a42d5930b31eb6326f019dead8278cb84d810: Status 404 returned error can't find the container with id b257ae36b8cd79e4eaeb59842d3a42d5930b31eb6326f019dead8278cb84d810
Apr 16 16:29:10.690550 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:10.690533 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34f77f7d_b36d_4041_8328_4e6971055b19.slice/crio-40754bf47ff4744d62dbb25f4f1e8b1e0f62d269e9b91a3ba7d82cb56608c0c7 WatchSource:0}: Error finding container 40754bf47ff4744d62dbb25f4f1e8b1e0f62d269e9b91a3ba7d82cb56608c0c7: Status 404 returned error can't find the container with id 40754bf47ff4744d62dbb25f4f1e8b1e0f62d269e9b91a3ba7d82cb56608c0c7
Apr 16 16:29:10.695932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.695903 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-df57k\" (UniqueName: \"kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k\") pod \"network-check-target-xzlm9\" (UID: \"6e5b8c51-4e68-4bf7-bb21-ec500c600441\") " pod="openshift-network-diagnostics/network-check-target-xzlm9"
Apr 16 16:29:10.696021 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:10.696010 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:29:10.696077 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:10.696033 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:29:10.696077 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:10.696043 2573 projected.go:194] Error preparing data for projected volume kube-api-access-df57k for pod openshift-network-diagnostics/network-check-target-xzlm9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:29:10.696147 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:10.696090 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k podName:6e5b8c51-4e68-4bf7-bb21-ec500c600441 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:11.696076234 +0000 UTC m=+4.280144964 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-df57k" (UniqueName: "kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k") pod "network-check-target-xzlm9" (UID: "6e5b8c51-4e68-4bf7-bb21-ec500c600441") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:29:10.911146 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.911110 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:24:08 +0000 UTC" deadline="2027-11-27 05:52:00.074283475 +0000 UTC"
Apr 16 16:29:10.911146 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.911138 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14149h22m49.163147288s"
Apr 16 16:29:10.972554 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.972530 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrlk5"
Apr 16 16:29:10.972710 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:10.972631 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrlk5" podUID="bb3c2920-54bb-42cd-a7e2-0c22345e99b4"
Apr 16 16:29:10.979302 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.979264 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pkd9h" event={"ID":"dc8e5144-a514-40ce-8e9e-c4b13ea6d7af","Type":"ContainerStarted","Data":"827032a1c7c2c2e54e7b058b9a9eeece886bf322ff328f9c6daea4f17773cf69"}
Apr 16 16:29:10.980803 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.980783 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" event={"ID":"f0049088-3d94-496a-ba74-286b0c64fa8a","Type":"ContainerStarted","Data":"b16ee637de5496a022d69adb7aed059450ecd9b1f16971a197eee86518c24180"}
Apr 16 16:29:10.981626 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.981607 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k" event={"ID":"2b15cd02-f09c-4e86-a518-ab8f11540068","Type":"ContainerStarted","Data":"89e7d0fce4161a7eb7a57b2714e5b14ac87e99e4f7129bf1864ec6baf73d5ad4"}
Apr 16 16:29:10.982503 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.982475 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" event={"ID":"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e","Type":"ContainerStarted","Data":"d2c786e32400c394ad38b4f6fc95ace63a350f471eb449777c78d8b425dee740"}
Apr 16 16:29:10.983344 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.983325 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7xh5k" event={"ID":"6839f3b4-61f8-4813-988a-8f23df71a050","Type":"ContainerStarted","Data":"8cb54bf0e37d80aa2a2d7fff5c6cc5c330680aa26be3ade03f97d292936e033d"}
Apr 16 16:29:10.986168 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.986150 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-12.ec2.internal" event={"ID":"f2b918a444bf444231f9bc2675345c3c","Type":"ContainerStarted","Data":"cc93bcc6e89f16f181ca5dcf9f1c6e39dce0af14a640a8235a98688eb93b42ac"}
Apr 16 16:29:10.987071 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.987043 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7vvmn" event={"ID":"34f77f7d-b36d-4041-8328-4e6971055b19","Type":"ContainerStarted","Data":"40754bf47ff4744d62dbb25f4f1e8b1e0f62d269e9b91a3ba7d82cb56608c0c7"}
Apr 16 16:29:10.988003 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.987978 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-26g6g" event={"ID":"db546d9e-9684-452f-876d-0d945addae4c","Type":"ContainerStarted","Data":"a9a64332cd859b524242945cecd03bf62a546b1a36e753a4a1886fed27f71a40"}
Apr 16 16:29:10.988854 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.988837 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pgpjj" event={"ID":"f931d8c4-2e7a-44a0-8e0b-ee392349e62d","Type":"ContainerStarted","Data":"b257ae36b8cd79e4eaeb59842d3a42d5930b31eb6326f019dead8278cb84d810"}
Apr 16 16:29:10.998977 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:10.998937 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-12.ec2.internal" podStartSLOduration=2.99892233 podStartE2EDuration="2.99892233s" podCreationTimestamp="2026-04-16 16:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:29:10.998710042 +0000 UTC m=+3.582778792" watchObservedRunningTime="2026-04-16 16:29:10.99892233 +0000 UTC m=+3.582991104"
Apr 16 16:29:11.501732 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:11.501697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs\") pod \"network-metrics-daemon-vrlk5\" (UID: \"bb3c2920-54bb-42cd-a7e2-0c22345e99b4\") " pod="openshift-multus/network-metrics-daemon-vrlk5"
Apr 16 16:29:11.501876 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:11.501836 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:29:11.501936 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:11.501898 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs podName:bb3c2920-54bb-42cd-a7e2-0c22345e99b4 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:13.5018807 +0000 UTC m=+6.085949447 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs") pod "network-metrics-daemon-vrlk5" (UID: "bb3c2920-54bb-42cd-a7e2-0c22345e99b4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:29:11.702999 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:11.702912 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-df57k\" (UniqueName: \"kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k\") pod \"network-check-target-xzlm9\" (UID: \"6e5b8c51-4e68-4bf7-bb21-ec500c600441\") " pod="openshift-network-diagnostics/network-check-target-xzlm9"
Apr 16 16:29:11.703164 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:11.703107 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:29:11.703164 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:11.703127 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:29:11.703164 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:11.703140 2573 projected.go:194] Error preparing data for projected volume kube-api-access-df57k for pod openshift-network-diagnostics/network-check-target-xzlm9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:29:11.703329 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:11.703196 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k podName:6e5b8c51-4e68-4bf7-bb21-ec500c600441 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:13.703179112 +0000 UTC m=+6.287247842 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-df57k" (UniqueName: "kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k") pod "network-check-target-xzlm9" (UID: "6e5b8c51-4e68-4bf7-bb21-ec500c600441") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:29:11.973183 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:11.973103 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzlm9"
Apr 16 16:29:11.973621 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:11.973237 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzlm9" podUID="6e5b8c51-4e68-4bf7-bb21-ec500c600441"
Apr 16 16:29:12.002200 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:12.001046 2573 generic.go:358] "Generic (PLEG): container finished" podID="7fb776f437685f75160db2a241f9a93e" containerID="dc6162dd11932761476716e341dd459efba57365d37d6e085affa76d00b5f045" exitCode=0
Apr 16 16:29:12.002200 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:12.001968 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal" event={"ID":"7fb776f437685f75160db2a241f9a93e","Type":"ContainerDied","Data":"dc6162dd11932761476716e341dd459efba57365d37d6e085affa76d00b5f045"}
Apr 16 16:29:12.973262 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:12.973220 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrlk5"
Apr 16 16:29:12.973728 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:12.973370 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrlk5" podUID="bb3c2920-54bb-42cd-a7e2-0c22345e99b4"
Apr 16 16:29:13.007751 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:13.007716 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal" event={"ID":"7fb776f437685f75160db2a241f9a93e","Type":"ContainerStarted","Data":"04759e6141a195c2de3253f5152d5d73faac1097e85806762e4a337507328125"}
Apr 16 16:29:13.020837 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:13.020781 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-12.ec2.internal" podStartSLOduration=5.02076395 podStartE2EDuration="5.02076395s" podCreationTimestamp="2026-04-16 16:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:29:13.02025767 +0000 UTC m=+5.604326420" watchObservedRunningTime="2026-04-16 16:29:13.02076395 +0000 UTC m=+5.604832701"
Apr 16 16:29:13.519845 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:13.519179 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs\") pod \"network-metrics-daemon-vrlk5\" (UID: \"bb3c2920-54bb-42cd-a7e2-0c22345e99b4\") " pod="openshift-multus/network-metrics-daemon-vrlk5"
Apr 16 16:29:13.519845 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:13.519361 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:29:13.519845 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:13.519480 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs podName:bb3c2920-54bb-42cd-a7e2-0c22345e99b4 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:17.519460574 +0000 UTC m=+10.103529305 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs") pod "network-metrics-daemon-vrlk5" (UID: "bb3c2920-54bb-42cd-a7e2-0c22345e99b4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:29:13.721500 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:13.721460 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-df57k\" (UniqueName: \"kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k\") pod \"network-check-target-xzlm9\" (UID: \"6e5b8c51-4e68-4bf7-bb21-ec500c600441\") " pod="openshift-network-diagnostics/network-check-target-xzlm9"
Apr 16 16:29:13.721683 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:13.721665 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:29:13.721747 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:13.721688 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:29:13.721747 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:13.721700 2573 projected.go:194] Error preparing data for projected volume kube-api-access-df57k for pod openshift-network-diagnostics/network-check-target-xzlm9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:29:13.721865 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:13.721757 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k podName:6e5b8c51-4e68-4bf7-bb21-ec500c600441 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:17.721738654 +0000 UTC m=+10.305807405 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-df57k" (UniqueName: "kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k") pod "network-check-target-xzlm9" (UID: "6e5b8c51-4e68-4bf7-bb21-ec500c600441") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:29:13.972941 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:13.972575 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzlm9"
Apr 16 16:29:13.972941 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:13.972720 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzlm9" podUID="6e5b8c51-4e68-4bf7-bb21-ec500c600441"
Apr 16 16:29:14.973260 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:14.973223 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrlk5"
Apr 16 16:29:14.973755 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:14.973383 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrlk5" podUID="bb3c2920-54bb-42cd-a7e2-0c22345e99b4"
Apr 16 16:29:15.972511 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:15.972370 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzlm9"
Apr 16 16:29:15.972511 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:15.972492 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzlm9" podUID="6e5b8c51-4e68-4bf7-bb21-ec500c600441"
Apr 16 16:29:16.972868 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:16.972800 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrlk5"
Apr 16 16:29:16.973378 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:16.972951 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-vrlk5" podUID="bb3c2920-54bb-42cd-a7e2-0c22345e99b4" Apr 16 16:29:17.552796 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:17.552668 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs\") pod \"network-metrics-daemon-vrlk5\" (UID: \"bb3c2920-54bb-42cd-a7e2-0c22345e99b4\") " pod="openshift-multus/network-metrics-daemon-vrlk5" Apr 16 16:29:17.552994 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:17.552857 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:29:17.552994 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:17.552929 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs podName:bb3c2920-54bb-42cd-a7e2-0c22345e99b4 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:25.552909631 +0000 UTC m=+18.136978359 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs") pod "network-metrics-daemon-vrlk5" (UID: "bb3c2920-54bb-42cd-a7e2-0c22345e99b4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:29:17.754173 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:17.754122 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-df57k\" (UniqueName: \"kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k\") pod \"network-check-target-xzlm9\" (UID: \"6e5b8c51-4e68-4bf7-bb21-ec500c600441\") " pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:17.754337 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:17.754320 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:29:17.754400 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:17.754348 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:29:17.754400 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:17.754362 2573 projected.go:194] Error preparing data for projected volume kube-api-access-df57k for pod openshift-network-diagnostics/network-check-target-xzlm9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:29:17.754498 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:17.754438 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k podName:6e5b8c51-4e68-4bf7-bb21-ec500c600441 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:29:25.754416233 +0000 UTC m=+18.338484979 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-df57k" (UniqueName: "kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k") pod "network-check-target-xzlm9" (UID: "6e5b8c51-4e68-4bf7-bb21-ec500c600441") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:29:17.973587 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:17.973502 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:17.974031 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:17.973619 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzlm9" podUID="6e5b8c51-4e68-4bf7-bb21-ec500c600441" Apr 16 16:29:18.972432 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:18.972386 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrlk5" Apr 16 16:29:18.972611 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:18.972525 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vrlk5" podUID="bb3c2920-54bb-42cd-a7e2-0c22345e99b4" Apr 16 16:29:19.972646 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:19.972605 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:19.973074 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:19.972754 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzlm9" podUID="6e5b8c51-4e68-4bf7-bb21-ec500c600441" Apr 16 16:29:20.972931 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:20.972891 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrlk5" Apr 16 16:29:20.973382 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:20.973033 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrlk5" podUID="bb3c2920-54bb-42cd-a7e2-0c22345e99b4" Apr 16 16:29:21.973304 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:21.973259 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:21.973751 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:21.973380 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzlm9" podUID="6e5b8c51-4e68-4bf7-bb21-ec500c600441" Apr 16 16:29:22.972890 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:22.972859 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrlk5" Apr 16 16:29:22.973064 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:22.972970 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrlk5" podUID="bb3c2920-54bb-42cd-a7e2-0c22345e99b4" Apr 16 16:29:23.973028 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:23.972993 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:23.973601 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:23.973146 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xzlm9" podUID="6e5b8c51-4e68-4bf7-bb21-ec500c600441" Apr 16 16:29:24.972746 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:24.972709 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrlk5" Apr 16 16:29:24.972933 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:24.972854 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrlk5" podUID="bb3c2920-54bb-42cd-a7e2-0c22345e99b4" Apr 16 16:29:25.605309 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:25.605276 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs\") pod \"network-metrics-daemon-vrlk5\" (UID: \"bb3c2920-54bb-42cd-a7e2-0c22345e99b4\") " pod="openshift-multus/network-metrics-daemon-vrlk5" Apr 16 16:29:25.605754 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:25.605413 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:29:25.605754 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:25.605478 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs podName:bb3c2920-54bb-42cd-a7e2-0c22345e99b4 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:41.605456671 +0000 UTC m=+34.189525400 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs") pod "network-metrics-daemon-vrlk5" (UID: "bb3c2920-54bb-42cd-a7e2-0c22345e99b4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:29:25.807204 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:25.807165 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-df57k\" (UniqueName: \"kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k\") pod \"network-check-target-xzlm9\" (UID: \"6e5b8c51-4e68-4bf7-bb21-ec500c600441\") " pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:25.807380 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:25.807293 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:29:25.807380 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:25.807318 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:29:25.807380 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:25.807332 2573 projected.go:194] Error preparing data for projected volume kube-api-access-df57k for pod openshift-network-diagnostics/network-check-target-xzlm9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:29:25.807523 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:25.807398 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k podName:6e5b8c51-4e68-4bf7-bb21-ec500c600441 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:29:41.807383014 +0000 UTC m=+34.391451743 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-df57k" (UniqueName: "kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k") pod "network-check-target-xzlm9" (UID: "6e5b8c51-4e68-4bf7-bb21-ec500c600441") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:29:25.972979 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:25.972908 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:25.973117 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:25.973010 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzlm9" podUID="6e5b8c51-4e68-4bf7-bb21-ec500c600441" Apr 16 16:29:26.973295 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:26.973259 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrlk5" Apr 16 16:29:26.973809 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:26.973388 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vrlk5" podUID="bb3c2920-54bb-42cd-a7e2-0c22345e99b4" Apr 16 16:29:27.972995 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:27.972821 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:27.973114 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:27.973092 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzlm9" podUID="6e5b8c51-4e68-4bf7-bb21-ec500c600441" Apr 16 16:29:28.034136 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:28.034110 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pgpjj" event={"ID":"f931d8c4-2e7a-44a0-8e0b-ee392349e62d","Type":"ContainerStarted","Data":"b7cd11cf7d5bd3a5667f6b6df5d22668f4a42d057767011d752bb085ef492166"} Apr 16 16:29:28.035255 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:28.035233 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pkd9h" event={"ID":"dc8e5144-a514-40ce-8e9e-c4b13ea6d7af","Type":"ContainerStarted","Data":"b0a154020d6b94395467571acdd5f6413e012ecff1e84563af8c60a2b3ff369f"} Apr 16 16:29:28.036629 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:28.036612 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" event={"ID":"f0049088-3d94-496a-ba74-286b0c64fa8a","Type":"ContainerStarted","Data":"b012711564ee511ba612b3524dd906be94de64d6aed07c628ef3552d826f5b3c"} Apr 16 16:29:28.036715 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:28.036633 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" 
event={"ID":"f0049088-3d94-496a-ba74-286b0c64fa8a","Type":"ContainerStarted","Data":"a8716b6e26ca5e647285189f0a1191271347bd19f4b7fd7ca6f264b84b265372"} Apr 16 16:29:28.037926 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:28.037905 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k" event={"ID":"2b15cd02-f09c-4e86-a518-ab8f11540068","Type":"ContainerStarted","Data":"45dabc9d064312bfa999cf61a1e9f4e10c4b64c37a878bc07e073568055ccb02"} Apr 16 16:29:28.039243 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:28.039216 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" event={"ID":"8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e","Type":"ContainerStarted","Data":"19a93d829b2029e639e33ba6563a8e10eb1be23cd10fe2e4f00901ff6d01b3ab"} Apr 16 16:29:28.040522 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:28.040501 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7xh5k" event={"ID":"6839f3b4-61f8-4813-988a-8f23df71a050","Type":"ContainerStarted","Data":"a6ff5be76546ad6df97b1c48962c73992e91d37d1a159b07cfc494359eeddea9"} Apr 16 16:29:28.041701 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:28.041681 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7vvmn" event={"ID":"34f77f7d-b36d-4041-8328-4e6971055b19","Type":"ContainerStarted","Data":"da6469776af98986094e23ee8d4ec60cfd8a37a3d104c0c45937d8a43403ca0b"} Apr 16 16:29:28.048001 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:28.047956 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pkd9h" podStartSLOduration=3.196263378 podStartE2EDuration="20.047942846s" podCreationTimestamp="2026-04-16 16:29:08 +0000 UTC" firstStartedPulling="2026-04-16 16:29:10.693375422 +0000 UTC m=+3.277444156" lastFinishedPulling="2026-04-16 16:29:27.545054894 +0000 UTC m=+20.129123624" 
observedRunningTime="2026-04-16 16:29:28.04746817 +0000 UTC m=+20.631536918" watchObservedRunningTime="2026-04-16 16:29:28.047942846 +0000 UTC m=+20.632011595" Apr 16 16:29:28.059836 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:28.059786 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-pgpjj" podStartSLOduration=3.228649152 podStartE2EDuration="20.059768479s" podCreationTimestamp="2026-04-16 16:29:08 +0000 UTC" firstStartedPulling="2026-04-16 16:29:10.693928912 +0000 UTC m=+3.277997652" lastFinishedPulling="2026-04-16 16:29:27.525048238 +0000 UTC m=+20.109116979" observedRunningTime="2026-04-16 16:29:28.059444418 +0000 UTC m=+20.643513168" watchObservedRunningTime="2026-04-16 16:29:28.059768479 +0000 UTC m=+20.643837230" Apr 16 16:29:28.072736 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:28.072535 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6t9jb" podStartSLOduration=3.191688266 podStartE2EDuration="20.072520627s" podCreationTimestamp="2026-04-16 16:29:08 +0000 UTC" firstStartedPulling="2026-04-16 16:29:10.664223662 +0000 UTC m=+3.248292402" lastFinishedPulling="2026-04-16 16:29:27.545056032 +0000 UTC m=+20.129124763" observedRunningTime="2026-04-16 16:29:28.072030806 +0000 UTC m=+20.656099557" watchObservedRunningTime="2026-04-16 16:29:28.072520627 +0000 UTC m=+20.656589376" Apr 16 16:29:28.107781 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:28.107724 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7xh5k" podStartSLOduration=3.193235101 podStartE2EDuration="20.107704965s" podCreationTimestamp="2026-04-16 16:29:08 +0000 UTC" firstStartedPulling="2026-04-16 16:29:10.661715052 +0000 UTC m=+3.245783779" lastFinishedPulling="2026-04-16 16:29:27.576184915 +0000 UTC m=+20.160253643" observedRunningTime="2026-04-16 16:29:28.107517513 +0000 UTC m=+20.691586261" 
watchObservedRunningTime="2026-04-16 16:29:28.107704965 +0000 UTC m=+20.691773722" Apr 16 16:29:28.352845 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:28.352772 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-pgpjj" Apr 16 16:29:28.353686 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:28.353663 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-pgpjj" Apr 16 16:29:28.693553 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:28.693527 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 16:29:28.949143 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:28.949001 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T16:29:28.693549132Z","UUID":"c66ef5d9-074c-46f2-b9a7-865717a49db6","Handler":null,"Name":"","Endpoint":""} Apr 16 16:29:28.950307 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:28.950289 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 16:29:28.950307 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:28.950313 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 16:29:28.972310 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:28.972290 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrlk5" Apr 16 16:29:28.972476 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:28.972387 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrlk5" podUID="bb3c2920-54bb-42cd-a7e2-0c22345e99b4" Apr 16 16:29:29.046721 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:29.046684 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" event={"ID":"f0049088-3d94-496a-ba74-286b0c64fa8a","Type":"ContainerStarted","Data":"9fad37c1c6ea10656a85b28ced253d4d1eb24f8850b494f287cbe1694efba3cd"} Apr 16 16:29:29.047305 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:29.046729 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" event={"ID":"f0049088-3d94-496a-ba74-286b0c64fa8a","Type":"ContainerStarted","Data":"38b20f99e92a09f8df2219cd1d7d62bbc86bd7a45f141c8a5a0a5bcbb7c6f9d0"} Apr 16 16:29:29.047305 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:29.046742 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" event={"ID":"f0049088-3d94-496a-ba74-286b0c64fa8a","Type":"ContainerStarted","Data":"805603d9147376704bd7ccef5eecd197265d3e81bec0e1b0c16947064533bc2e"} Apr 16 16:29:29.047305 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:29.046753 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" event={"ID":"f0049088-3d94-496a-ba74-286b0c64fa8a","Type":"ContainerStarted","Data":"3532ac0322d00ea3470e2fb20a3ed0d89b04796e70bdc99361172b1013984223"} Apr 16 16:29:29.048450 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:29.048425 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k" event={"ID":"2b15cd02-f09c-4e86-a518-ab8f11540068","Type":"ContainerStarted","Data":"59b580527273331aa69cc2d2c103b8486d223b55481b76c899f66b64c30f7d95"} Apr 16 16:29:29.049891 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:29.049864 2573 generic.go:358] "Generic (PLEG): container finished" podID="34f77f7d-b36d-4041-8328-4e6971055b19" containerID="da6469776af98986094e23ee8d4ec60cfd8a37a3d104c0c45937d8a43403ca0b" exitCode=0 Apr 16 16:29:29.050000 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:29.049922 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7vvmn" event={"ID":"34f77f7d-b36d-4041-8328-4e6971055b19","Type":"ContainerDied","Data":"da6469776af98986094e23ee8d4ec60cfd8a37a3d104c0c45937d8a43403ca0b"} Apr 16 16:29:29.051393 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:29.051320 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-26g6g" event={"ID":"db546d9e-9684-452f-876d-0d945addae4c","Type":"ContainerStarted","Data":"95fea7cf5055d25850169f2a232a19e30a4ba9fdb33bfd067ea3d109afcf382a"} Apr 16 16:29:29.051682 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:29.051644 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-pgpjj" Apr 16 16:29:29.052218 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:29.052197 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-pgpjj" Apr 16 16:29:29.085277 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:29.085226 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-26g6g" podStartSLOduration=4.204299659 podStartE2EDuration="21.08521092s" podCreationTimestamp="2026-04-16 16:29:08 +0000 UTC" firstStartedPulling="2026-04-16 
16:29:10.664192863 +0000 UTC m=+3.248261609" lastFinishedPulling="2026-04-16 16:29:27.545104128 +0000 UTC m=+20.129172870" observedRunningTime="2026-04-16 16:29:29.085083512 +0000 UTC m=+21.669152260" watchObservedRunningTime="2026-04-16 16:29:29.08521092 +0000 UTC m=+21.669279670" Apr 16 16:29:29.972533 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:29.972500 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:29.972771 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:29.972613 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzlm9" podUID="6e5b8c51-4e68-4bf7-bb21-ec500c600441" Apr 16 16:29:30.055543 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:30.055502 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k" event={"ID":"2b15cd02-f09c-4e86-a518-ab8f11540068","Type":"ContainerStarted","Data":"efd405ce8377e6f0df16215a558b2e736933f05fd72e314246548747d95402eb"} Apr 16 16:29:30.078831 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:30.078771 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ccn5k" podStartSLOduration=3.3833905570000002 podStartE2EDuration="22.078753377s" podCreationTimestamp="2026-04-16 16:29:08 +0000 UTC" firstStartedPulling="2026-04-16 16:29:10.686906215 +0000 UTC m=+3.270974943" lastFinishedPulling="2026-04-16 16:29:29.382269036 +0000 UTC m=+21.966337763" observedRunningTime="2026-04-16 16:29:30.078502661 +0000 UTC m=+22.662571409" watchObservedRunningTime="2026-04-16 16:29:30.078753377 +0000 UTC 
m=+22.662822130" Apr 16 16:29:30.973309 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:30.973127 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrlk5" Apr 16 16:29:30.973481 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:30.973400 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrlk5" podUID="bb3c2920-54bb-42cd-a7e2-0c22345e99b4" Apr 16 16:29:31.061016 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:31.060972 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" event={"ID":"f0049088-3d94-496a-ba74-286b0c64fa8a","Type":"ContainerStarted","Data":"183be52c96d7a0f98b6844286ee43ffe7593199f8010a439ced1b4410b683e06"} Apr 16 16:29:31.972343 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:31.972307 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:31.972583 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:31.972422 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzlm9" podUID="6e5b8c51-4e68-4bf7-bb21-ec500c600441" Apr 16 16:29:32.972557 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:32.972521 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrlk5" Apr 16 16:29:32.973112 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:32.972661 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrlk5" podUID="bb3c2920-54bb-42cd-a7e2-0c22345e99b4" Apr 16 16:29:33.972909 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:33.972733 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:33.973521 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:33.972981 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xzlm9" podUID="6e5b8c51-4e68-4bf7-bb21-ec500c600441" Apr 16 16:29:34.069847 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:34.069809 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" event={"ID":"f0049088-3d94-496a-ba74-286b0c64fa8a","Type":"ContainerStarted","Data":"f786f2bb5800b0422e0279db867dadeb7ae5219a20fd2def11aefb184847e991"} Apr 16 16:29:34.070137 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:34.070112 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:34.071411 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:34.071387 2573 generic.go:358] "Generic (PLEG): container finished" podID="34f77f7d-b36d-4041-8328-4e6971055b19" containerID="766feff0fbf99117710d91624d11a46eefefc5639a589b70a21f27a86d62f2e5" exitCode=0 Apr 16 16:29:34.071501 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:34.071426 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7vvmn" event={"ID":"34f77f7d-b36d-4041-8328-4e6971055b19","Type":"ContainerDied","Data":"766feff0fbf99117710d91624d11a46eefefc5639a589b70a21f27a86d62f2e5"} Apr 16 16:29:34.084795 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:34.084774 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:34.100198 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:34.100151 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" podStartSLOduration=8.870364147 podStartE2EDuration="26.100140321s" podCreationTimestamp="2026-04-16 16:29:08 +0000 UTC" firstStartedPulling="2026-04-16 16:29:10.686871136 +0000 UTC m=+3.270939863" lastFinishedPulling="2026-04-16 16:29:27.916647306 +0000 UTC m=+20.500716037" observedRunningTime="2026-04-16 
16:29:34.099908984 +0000 UTC m=+26.683977732" watchObservedRunningTime="2026-04-16 16:29:34.100140321 +0000 UTC m=+26.684209069" Apr 16 16:29:34.972770 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:34.972741 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrlk5" Apr 16 16:29:34.973037 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:34.972846 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrlk5" podUID="bb3c2920-54bb-42cd-a7e2-0c22345e99b4" Apr 16 16:29:35.074167 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:35.074142 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:35.074288 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:35.074175 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:35.088166 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:35.088147 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:29:35.431492 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:35.431461 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xzlm9"] Apr 16 16:29:35.431696 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:35.431575 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:35.431696 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:35.431679 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzlm9" podUID="6e5b8c51-4e68-4bf7-bb21-ec500c600441" Apr 16 16:29:35.434401 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:35.434379 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vrlk5"] Apr 16 16:29:35.434498 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:35.434478 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrlk5" Apr 16 16:29:35.434573 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:35.434555 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vrlk5" podUID="bb3c2920-54bb-42cd-a7e2-0c22345e99b4" Apr 16 16:29:36.077381 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:36.077352 2573 generic.go:358] "Generic (PLEG): container finished" podID="34f77f7d-b36d-4041-8328-4e6971055b19" containerID="de9f5a0b3b4739472fec6832df92f8d72073b798572aa30ada975ce220d99243" exitCode=0 Apr 16 16:29:36.077757 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:36.077441 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7vvmn" event={"ID":"34f77f7d-b36d-4041-8328-4e6971055b19","Type":"ContainerDied","Data":"de9f5a0b3b4739472fec6832df92f8d72073b798572aa30ada975ce220d99243"} Apr 16 16:29:36.972402 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:36.972370 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrlk5" Apr 16 16:29:36.972529 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:36.972370 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:36.972529 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:36.972493 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrlk5" podUID="bb3c2920-54bb-42cd-a7e2-0c22345e99b4" Apr 16 16:29:36.972618 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:36.972604 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzlm9" podUID="6e5b8c51-4e68-4bf7-bb21-ec500c600441" Apr 16 16:29:37.080507 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:37.080478 2573 generic.go:358] "Generic (PLEG): container finished" podID="34f77f7d-b36d-4041-8328-4e6971055b19" containerID="ac228a52f595f11c5913bdf82d2fa307176bf3a157731e2066b0dea09bed41b6" exitCode=0 Apr 16 16:29:37.080976 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:37.080562 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7vvmn" event={"ID":"34f77f7d-b36d-4041-8328-4e6971055b19","Type":"ContainerDied","Data":"ac228a52f595f11c5913bdf82d2fa307176bf3a157731e2066b0dea09bed41b6"} Apr 16 16:29:38.202083 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:38.201818 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-78zgg"] Apr 16 16:29:38.203886 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:38.203853 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-78zgg" Apr 16 16:29:38.206282 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:38.206260 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 16:29:38.206282 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:38.206281 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 16:29:38.207365 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:38.206869 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wpjpl\"" Apr 16 16:29:38.293163 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:38.293130 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a2387338-f572-4af1-a326-edcc55b8077f-tmp-dir\") pod \"node-resolver-78zgg\" (UID: \"a2387338-f572-4af1-a326-edcc55b8077f\") " pod="openshift-dns/node-resolver-78zgg" Apr 16 16:29:38.293294 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:38.293181 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6q9j\" (UniqueName: \"kubernetes.io/projected/a2387338-f572-4af1-a326-edcc55b8077f-kube-api-access-v6q9j\") pod \"node-resolver-78zgg\" (UID: \"a2387338-f572-4af1-a326-edcc55b8077f\") " pod="openshift-dns/node-resolver-78zgg" Apr 16 16:29:38.293294 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:38.293226 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a2387338-f572-4af1-a326-edcc55b8077f-hosts-file\") pod \"node-resolver-78zgg\" (UID: \"a2387338-f572-4af1-a326-edcc55b8077f\") " pod="openshift-dns/node-resolver-78zgg" Apr 16 16:29:38.394420 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:38.394382 
2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a2387338-f572-4af1-a326-edcc55b8077f-hosts-file\") pod \"node-resolver-78zgg\" (UID: \"a2387338-f572-4af1-a326-edcc55b8077f\") " pod="openshift-dns/node-resolver-78zgg" Apr 16 16:29:38.394575 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:38.394467 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a2387338-f572-4af1-a326-edcc55b8077f-tmp-dir\") pod \"node-resolver-78zgg\" (UID: \"a2387338-f572-4af1-a326-edcc55b8077f\") " pod="openshift-dns/node-resolver-78zgg" Apr 16 16:29:38.394575 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:38.394494 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6q9j\" (UniqueName: \"kubernetes.io/projected/a2387338-f572-4af1-a326-edcc55b8077f-kube-api-access-v6q9j\") pod \"node-resolver-78zgg\" (UID: \"a2387338-f572-4af1-a326-edcc55b8077f\") " pod="openshift-dns/node-resolver-78zgg" Apr 16 16:29:38.394575 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:38.394502 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a2387338-f572-4af1-a326-edcc55b8077f-hosts-file\") pod \"node-resolver-78zgg\" (UID: \"a2387338-f572-4af1-a326-edcc55b8077f\") " pod="openshift-dns/node-resolver-78zgg" Apr 16 16:29:38.394851 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:38.394826 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a2387338-f572-4af1-a326-edcc55b8077f-tmp-dir\") pod \"node-resolver-78zgg\" (UID: \"a2387338-f572-4af1-a326-edcc55b8077f\") " pod="openshift-dns/node-resolver-78zgg" Apr 16 16:29:38.403954 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:38.403932 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-v6q9j\" (UniqueName: \"kubernetes.io/projected/a2387338-f572-4af1-a326-edcc55b8077f-kube-api-access-v6q9j\") pod \"node-resolver-78zgg\" (UID: \"a2387338-f572-4af1-a326-edcc55b8077f\") " pod="openshift-dns/node-resolver-78zgg" Apr 16 16:29:38.516299 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:38.516227 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-78zgg" Apr 16 16:29:38.973033 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:38.973000 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrlk5" Apr 16 16:29:38.973213 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:38.973002 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:38.973213 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:38.973142 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrlk5" podUID="bb3c2920-54bb-42cd-a7e2-0c22345e99b4" Apr 16 16:29:38.973335 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:38.973223 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xzlm9" podUID="6e5b8c51-4e68-4bf7-bb21-ec500c600441" Apr 16 16:29:39.086553 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:39.086517 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-78zgg" event={"ID":"a2387338-f572-4af1-a326-edcc55b8077f","Type":"ContainerStarted","Data":"7c8c428f4eb773cf204b0546dc13cc2e223625c1b641094ffad89057f67ccd91"} Apr 16 16:29:39.086728 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:39.086561 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-78zgg" event={"ID":"a2387338-f572-4af1-a326-edcc55b8077f","Type":"ContainerStarted","Data":"50185a766a39c3686cc94e32d8b70eb7fbc0098f022def9881f6798dd0d601cf"} Apr 16 16:29:40.727156 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.727128 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-12.ec2.internal" event="NodeReady" Apr 16 16:29:40.728136 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.727267 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 16:29:40.764251 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.764185 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-78zgg" podStartSLOduration=2.764170385 podStartE2EDuration="2.764170385s" podCreationTimestamp="2026-04-16 16:29:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:29:39.09982286 +0000 UTC m=+31.683891610" watchObservedRunningTime="2026-04-16 16:29:40.764170385 +0000 UTC m=+33.348239134" Apr 16 16:29:40.764531 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.764509 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-65c845f5f8-7c7zn"] Apr 16 16:29:40.792374 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.792350 2573 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-b8wtr"] Apr 16 16:29:40.792529 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.792499 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:40.795663 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.795628 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 16:29:40.795787 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.795637 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 16:29:40.795787 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.795699 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fdzgl\"" Apr 16 16:29:40.795973 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.795956 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 16:29:40.801112 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.800926 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 16:29:40.813535 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.813513 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-xx6cp"] Apr 16 16:29:40.813707 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.813691 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-b8wtr" Apr 16 16:29:40.816630 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.816608 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 16:29:40.817167 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.817149 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 16:29:40.817250 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.817176 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 16:29:40.817250 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.817177 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:29:40.817526 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.817506 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-fsq4l\"" Apr 16 16:29:40.828093 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.828073 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764fd9845c-g5ggz"] Apr 16 16:29:40.828256 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.828240 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xx6cp" Apr 16 16:29:40.830594 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.830578 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 16:29:40.831041 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.830863 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-rq8nm\"" Apr 16 16:29:40.831041 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.830929 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:29:40.831041 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.830957 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 16:29:40.831041 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.830984 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 16:29:40.840928 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.840909 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-s8lmq"] Apr 16 16:29:40.841067 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.841052 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764fd9845c-g5ggz" Apr 16 16:29:40.844136 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.844116 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 16:29:40.844251 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.844136 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 16:29:40.844313 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.844294 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-hf7xw\"" Apr 16 16:29:40.844528 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.844511 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 16:29:40.844747 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.844730 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 16:29:40.853256 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.853235 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-zrwnz"] Apr 16 16:29:40.853392 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.853378 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s8lmq" Apr 16 16:29:40.856191 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.856171 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 16:29:40.856191 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.856185 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 16:29:40.856345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.856241 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-fczp5\"" Apr 16 16:29:40.865775 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.865758 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc4f964d6-m5gjf"] Apr 16 16:29:40.865913 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.865900 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zrwnz" Apr 16 16:29:40.871569 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.871550 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:29:40.872625 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.872608 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-kwk9c\"" Apr 16 16:29:40.873447 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.873430 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:29:40.877425 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.877318 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln"] Apr 16 16:29:40.877491 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.877457 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc4f964d6-m5gjf" Apr 16 16:29:40.880538 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.880522 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 16:29:40.889518 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.889503 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-65c845f5f8-7c7zn"] Apr 16 16:29:40.889616 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.889523 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764fd9845c-g5ggz"] Apr 16 16:29:40.889616 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.889533 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-xx6cp"] Apr 16 16:29:40.889616 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.889542 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-s8lmq"] Apr 16 16:29:40.889616 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.889553 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-zrwnz"] Apr 16 16:29:40.889616 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.889567 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc4f964d6-m5gjf"] Apr 16 16:29:40.889616 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.889578 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln"] Apr 16 16:29:40.889616 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.889592 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-b8wtr"] Apr 16 16:29:40.889955 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.889664 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:29:40.892432 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.892412 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 16:29:40.892546 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.892456 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 16:29:40.892747 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.892727 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 16:29:40.902563 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.902545 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 16:29:40.912393 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.912370 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-certificates\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:40.912469 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.912413 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0f60a8ab-e833-4e10-8ef6-e02666b7e615-image-registry-private-configuration\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:40.912469 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.912439 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-bound-sa-token\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:40.912549 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.912489 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbvv9\" (UniqueName: \"kubernetes.io/projected/91b28333-7ad2-40bf-b991-a75717a9ead4-kube-api-access-gbvv9\") pod \"service-ca-operator-69965bb79d-xx6cp\" (UID: \"91b28333-7ad2-40bf-b991-a75717a9ead4\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xx6cp" Apr 16 16:29:40.912549 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.912510 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38423965-6637-4f51-89af-e465c2846faf-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-b8wtr\" (UID: \"38423965-6637-4f51-89af-e465c2846faf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-b8wtr" Apr 16 16:29:40.912549 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.912536 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:40.912692 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.912559 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0f60a8ab-e833-4e10-8ef6-e02666b7e615-installation-pull-secrets\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:40.912692 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.912602 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r78n\" (UniqueName: \"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-kube-api-access-7r78n\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:40.912692 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.912680 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxxgr\" (UniqueName: \"kubernetes.io/projected/38423965-6637-4f51-89af-e465c2846faf-kube-api-access-zxxgr\") pod \"kube-storage-version-migrator-operator-756bb7d76f-b8wtr\" (UID: \"38423965-6637-4f51-89af-e465c2846faf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-b8wtr" Apr 16 16:29:40.912806 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.912701 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91b28333-7ad2-40bf-b991-a75717a9ead4-config\") pod 
\"service-ca-operator-69965bb79d-xx6cp\" (UID: \"91b28333-7ad2-40bf-b991-a75717a9ead4\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xx6cp" Apr 16 16:29:40.912806 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.912723 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38423965-6637-4f51-89af-e465c2846faf-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-b8wtr\" (UID: \"38423965-6637-4f51-89af-e465c2846faf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-b8wtr" Apr 16 16:29:40.912806 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.912761 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4917fbec-85cc-4edb-b09c-7f4a6a5e83f4-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-764fd9845c-g5ggz\" (UID: \"4917fbec-85cc-4edb-b09c-7f4a6a5e83f4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764fd9845c-g5ggz" Apr 16 16:29:40.912806 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.912781 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f60a8ab-e833-4e10-8ef6-e02666b7e615-trusted-ca\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:40.912980 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.912812 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvxst\" (UniqueName: \"kubernetes.io/projected/4917fbec-85cc-4edb-b09c-7f4a6a5e83f4-kube-api-access-dvxst\") pod \"managed-serviceaccount-addon-agent-764fd9845c-g5ggz\" (UID: 
\"4917fbec-85cc-4edb-b09c-7f4a6a5e83f4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764fd9845c-g5ggz" Apr 16 16:29:40.912980 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.912850 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0f60a8ab-e833-4e10-8ef6-e02666b7e615-ca-trust-extracted\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:40.912980 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.912866 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91b28333-7ad2-40bf-b991-a75717a9ead4-serving-cert\") pod \"service-ca-operator-69965bb79d-xx6cp\" (UID: \"91b28333-7ad2-40bf-b991-a75717a9ead4\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xx6cp" Apr 16 16:29:40.972843 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.972812 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrlk5" Apr 16 16:29:40.973018 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.972849 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:40.975533 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.975500 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:29:40.975674 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.975540 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-ws7rw\"" Apr 16 16:29:40.975782 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:40.975767 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tqp8k\"" Apr 16 16:29:41.013391 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.013352 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:41.013391 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.013388 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0f60a8ab-e833-4e10-8ef6-e02666b7e615-installation-pull-secrets\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:41.013570 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.013415 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7r78n\" (UniqueName: \"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-kube-api-access-7r78n\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " 
pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:41.013570 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.013447 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-s8lmq\" (UID: \"deb6ad86-785f-4724-a0f9-ecc5341ee981\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s8lmq" Apr 16 16:29:41.013570 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.013479 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxxgr\" (UniqueName: \"kubernetes.io/projected/38423965-6637-4f51-89af-e465c2846faf-kube-api-access-zxxgr\") pod \"kube-storage-version-migrator-operator-756bb7d76f-b8wtr\" (UID: \"38423965-6637-4f51-89af-e465c2846faf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-b8wtr" Apr 16 16:29:41.013570 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:41.013491 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:29:41.013570 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.013502 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/160b9aaa-8178-4147-a5f7-c886dd47054c-hub\") pod \"cluster-proxy-proxy-agent-7c6fd87b8-mqjln\" (UID: \"160b9aaa-8178-4147-a5f7-c886dd47054c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:29:41.013570 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.013532 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91b28333-7ad2-40bf-b991-a75717a9ead4-config\") pod 
\"service-ca-operator-69965bb79d-xx6cp\" (UID: \"91b28333-7ad2-40bf-b991-a75717a9ead4\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xx6cp" Apr 16 16:29:41.013570 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.013558 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38423965-6637-4f51-89af-e465c2846faf-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-b8wtr\" (UID: \"38423965-6637-4f51-89af-e465c2846faf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-b8wtr" Apr 16 16:29:41.013919 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.013583 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75v7d\" (UniqueName: \"kubernetes.io/projected/a75aa4dc-6a56-436d-b8cb-ef570834bf3e-kube-api-access-75v7d\") pod \"klusterlet-addon-workmgr-7bc4f964d6-m5gjf\" (UID: \"a75aa4dc-6a56-436d-b8cb-ef570834bf3e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc4f964d6-m5gjf" Apr 16 16:29:41.013919 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.013613 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/160b9aaa-8178-4147-a5f7-c886dd47054c-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c6fd87b8-mqjln\" (UID: \"160b9aaa-8178-4147-a5f7-c886dd47054c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:29:41.013919 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.013642 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4917fbec-85cc-4edb-b09c-7f4a6a5e83f4-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-764fd9845c-g5ggz\" (UID: \"4917fbec-85cc-4edb-b09c-7f4a6a5e83f4\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764fd9845c-g5ggz" Apr 16 16:29:41.013919 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:41.013509 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65c845f5f8-7c7zn: secret "image-registry-tls" not found Apr 16 16:29:41.013919 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.013690 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f60a8ab-e833-4e10-8ef6-e02666b7e615-trusted-ca\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:41.013919 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:41.013769 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls podName:0f60a8ab-e833-4e10-8ef6-e02666b7e615 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:41.513746907 +0000 UTC m=+34.097815637 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls") pod "image-registry-65c845f5f8-7c7zn" (UID: "0f60a8ab-e833-4e10-8ef6-e02666b7e615") : secret "image-registry-tls" not found Apr 16 16:29:41.013919 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.013861 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a75aa4dc-6a56-436d-b8cb-ef570834bf3e-klusterlet-config\") pod \"klusterlet-addon-workmgr-7bc4f964d6-m5gjf\" (UID: \"a75aa4dc-6a56-436d-b8cb-ef570834bf3e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc4f964d6-m5gjf" Apr 16 16:29:41.013919 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.013893 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh882\" (UniqueName: \"kubernetes.io/projected/160b9aaa-8178-4147-a5f7-c886dd47054c-kube-api-access-mh882\") pod \"cluster-proxy-proxy-agent-7c6fd87b8-mqjln\" (UID: \"160b9aaa-8178-4147-a5f7-c886dd47054c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:29:41.014294 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.014277 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvxst\" (UniqueName: \"kubernetes.io/projected/4917fbec-85cc-4edb-b09c-7f4a6a5e83f4-kube-api-access-dvxst\") pod \"managed-serviceaccount-addon-agent-764fd9845c-g5ggz\" (UID: \"4917fbec-85cc-4edb-b09c-7f4a6a5e83f4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764fd9845c-g5ggz" Apr 16 16:29:41.014365 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.014304 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/a75aa4dc-6a56-436d-b8cb-ef570834bf3e-tmp\") pod \"klusterlet-addon-workmgr-7bc4f964d6-m5gjf\" (UID: \"a75aa4dc-6a56-436d-b8cb-ef570834bf3e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc4f964d6-m5gjf" Apr 16 16:29:41.014365 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.014321 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/160b9aaa-8178-4147-a5f7-c886dd47054c-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7c6fd87b8-mqjln\" (UID: \"160b9aaa-8178-4147-a5f7-c886dd47054c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:29:41.014365 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.014338 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0f60a8ab-e833-4e10-8ef6-e02666b7e615-ca-trust-extracted\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:41.014365 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.014361 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91b28333-7ad2-40bf-b991-a75717a9ead4-serving-cert\") pod \"service-ca-operator-69965bb79d-xx6cp\" (UID: \"91b28333-7ad2-40bf-b991-a75717a9ead4\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xx6cp" Apr 16 16:29:41.014498 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.014383 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-certificates\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " 
pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:41.014498 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.014395 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91b28333-7ad2-40bf-b991-a75717a9ead4-config\") pod \"service-ca-operator-69965bb79d-xx6cp\" (UID: \"91b28333-7ad2-40bf-b991-a75717a9ead4\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xx6cp" Apr 16 16:29:41.014498 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.014404 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0f60a8ab-e833-4e10-8ef6-e02666b7e615-image-registry-private-configuration\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:41.014498 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.014423 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-bound-sa-token\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:41.014498 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.014448 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/deb6ad86-785f-4724-a0f9-ecc5341ee981-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-s8lmq\" (UID: \"deb6ad86-785f-4724-a0f9-ecc5341ee981\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s8lmq" Apr 16 16:29:41.014498 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.014479 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-gbvv9\" (UniqueName: \"kubernetes.io/projected/91b28333-7ad2-40bf-b991-a75717a9ead4-kube-api-access-gbvv9\") pod \"service-ca-operator-69965bb79d-xx6cp\" (UID: \"91b28333-7ad2-40bf-b991-a75717a9ead4\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xx6cp" Apr 16 16:29:41.014498 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.014481 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38423965-6637-4f51-89af-e465c2846faf-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-b8wtr\" (UID: \"38423965-6637-4f51-89af-e465c2846faf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-b8wtr" Apr 16 16:29:41.014498 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.014496 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/160b9aaa-8178-4147-a5f7-c886dd47054c-ca\") pod \"cluster-proxy-proxy-agent-7c6fd87b8-mqjln\" (UID: \"160b9aaa-8178-4147-a5f7-c886dd47054c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:29:41.014845 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.014548 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/160b9aaa-8178-4147-a5f7-c886dd47054c-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c6fd87b8-mqjln\" (UID: \"160b9aaa-8178-4147-a5f7-c886dd47054c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:29:41.014845 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.014587 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38423965-6637-4f51-89af-e465c2846faf-serving-cert\") pod 
\"kube-storage-version-migrator-operator-756bb7d76f-b8wtr\" (UID: \"38423965-6637-4f51-89af-e465c2846faf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-b8wtr" Apr 16 16:29:41.014845 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.014615 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x67lw\" (UniqueName: \"kubernetes.io/projected/fb9f58f2-5c1e-437f-8ea7-a75f424536f4-kube-api-access-x67lw\") pod \"network-check-source-7b678d77c7-zrwnz\" (UID: \"fb9f58f2-5c1e-437f-8ea7-a75f424536f4\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zrwnz" Apr 16 16:29:41.014845 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.014776 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f60a8ab-e833-4e10-8ef6-e02666b7e615-trusted-ca\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:41.015015 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.014940 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-certificates\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:41.015015 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.014951 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0f60a8ab-e833-4e10-8ef6-e02666b7e615-ca-trust-extracted\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:41.017332 
ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.017313 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91b28333-7ad2-40bf-b991-a75717a9ead4-serving-cert\") pod \"service-ca-operator-69965bb79d-xx6cp\" (UID: \"91b28333-7ad2-40bf-b991-a75717a9ead4\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xx6cp" Apr 16 16:29:41.017430 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.017409 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38423965-6637-4f51-89af-e465c2846faf-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-b8wtr\" (UID: \"38423965-6637-4f51-89af-e465c2846faf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-b8wtr" Apr 16 16:29:41.027775 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.027750 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0f60a8ab-e833-4e10-8ef6-e02666b7e615-installation-pull-secrets\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:41.027971 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.027949 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0f60a8ab-e833-4e10-8ef6-e02666b7e615-image-registry-private-configuration\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:41.027971 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.027963 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/4917fbec-85cc-4edb-b09c-7f4a6a5e83f4-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-764fd9845c-g5ggz\" (UID: \"4917fbec-85cc-4edb-b09c-7f4a6a5e83f4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764fd9845c-g5ggz" Apr 16 16:29:41.028188 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.028166 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r78n\" (UniqueName: \"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-kube-api-access-7r78n\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:41.028485 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.028463 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxxgr\" (UniqueName: \"kubernetes.io/projected/38423965-6637-4f51-89af-e465c2846faf-kube-api-access-zxxgr\") pod \"kube-storage-version-migrator-operator-756bb7d76f-b8wtr\" (UID: \"38423965-6637-4f51-89af-e465c2846faf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-b8wtr" Apr 16 16:29:41.028959 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.028942 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvxst\" (UniqueName: \"kubernetes.io/projected/4917fbec-85cc-4edb-b09c-7f4a6a5e83f4-kube-api-access-dvxst\") pod \"managed-serviceaccount-addon-agent-764fd9845c-g5ggz\" (UID: \"4917fbec-85cc-4edb-b09c-7f4a6a5e83f4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764fd9845c-g5ggz" Apr 16 16:29:41.031251 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.031228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-bound-sa-token\") pod 
\"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:41.033070 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.033047 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-q86dl"] Apr 16 16:29:41.035772 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.035751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbvv9\" (UniqueName: \"kubernetes.io/projected/91b28333-7ad2-40bf-b991-a75717a9ead4-kube-api-access-gbvv9\") pod \"service-ca-operator-69965bb79d-xx6cp\" (UID: \"91b28333-7ad2-40bf-b991-a75717a9ead4\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xx6cp" Apr 16 16:29:41.067929 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.067899 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-q86dl"] Apr 16 16:29:41.067929 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.067932 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wgk4m"] Apr 16 16:29:41.068119 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.068071 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q86dl" Apr 16 16:29:41.070623 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.070599 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 16:29:41.086678 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.086646 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wgk4m"] Apr 16 16:29:41.086786 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.086768 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-wgk4m" Apr 16 16:29:41.089357 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.089338 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 16:29:41.089468 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.089372 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 16:29:41.089468 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.089372 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8mbbv\"" Apr 16 16:29:41.104078 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.104054 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-btkpq"] Apr 16 16:29:41.115376 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.115355 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/deb6ad86-785f-4724-a0f9-ecc5341ee981-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-s8lmq\" (UID: \"deb6ad86-785f-4724-a0f9-ecc5341ee981\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s8lmq" Apr 16 16:29:41.115479 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.115397 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/160b9aaa-8178-4147-a5f7-c886dd47054c-ca\") pod \"cluster-proxy-proxy-agent-7c6fd87b8-mqjln\" (UID: \"160b9aaa-8178-4147-a5f7-c886dd47054c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:29:41.115479 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.115416 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: 
\"kubernetes.io/secret/160b9aaa-8178-4147-a5f7-c886dd47054c-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c6fd87b8-mqjln\" (UID: \"160b9aaa-8178-4147-a5f7-c886dd47054c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:29:41.115479 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.115442 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x67lw\" (UniqueName: \"kubernetes.io/projected/fb9f58f2-5c1e-437f-8ea7-a75f424536f4-kube-api-access-x67lw\") pod \"network-check-source-7b678d77c7-zrwnz\" (UID: \"fb9f58f2-5c1e-437f-8ea7-a75f424536f4\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zrwnz" Apr 16 16:29:41.115634 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.115498 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-s8lmq\" (UID: \"deb6ad86-785f-4724-a0f9-ecc5341ee981\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s8lmq" Apr 16 16:29:41.115634 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.115530 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/160b9aaa-8178-4147-a5f7-c886dd47054c-hub\") pod \"cluster-proxy-proxy-agent-7c6fd87b8-mqjln\" (UID: \"160b9aaa-8178-4147-a5f7-c886dd47054c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:29:41.115634 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.115558 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75v7d\" (UniqueName: \"kubernetes.io/projected/a75aa4dc-6a56-436d-b8cb-ef570834bf3e-kube-api-access-75v7d\") pod \"klusterlet-addon-workmgr-7bc4f964d6-m5gjf\" (UID: 
\"a75aa4dc-6a56-436d-b8cb-ef570834bf3e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc4f964d6-m5gjf" Apr 16 16:29:41.115634 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.115583 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/160b9aaa-8178-4147-a5f7-c886dd47054c-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c6fd87b8-mqjln\" (UID: \"160b9aaa-8178-4147-a5f7-c886dd47054c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:29:41.115634 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.115613 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a75aa4dc-6a56-436d-b8cb-ef570834bf3e-klusterlet-config\") pod \"klusterlet-addon-workmgr-7bc4f964d6-m5gjf\" (UID: \"a75aa4dc-6a56-436d-b8cb-ef570834bf3e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc4f964d6-m5gjf" Apr 16 16:29:41.116962 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.116110 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/deb6ad86-785f-4724-a0f9-ecc5341ee981-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-s8lmq\" (UID: \"deb6ad86-785f-4724-a0f9-ecc5341ee981\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s8lmq" Apr 16 16:29:41.116962 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:41.116211 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:29:41.116962 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:41.116287 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert podName:deb6ad86-785f-4724-a0f9-ecc5341ee981 
nodeName:}" failed. No retries permitted until 2026-04-16 16:29:41.616269534 +0000 UTC m=+34.200338269 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-s8lmq" (UID: "deb6ad86-785f-4724-a0f9-ecc5341ee981") : secret "networking-console-plugin-cert" not found Apr 16 16:29:41.116962 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.116363 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mh882\" (UniqueName: \"kubernetes.io/projected/160b9aaa-8178-4147-a5f7-c886dd47054c-kube-api-access-mh882\") pod \"cluster-proxy-proxy-agent-7c6fd87b8-mqjln\" (UID: \"160b9aaa-8178-4147-a5f7-c886dd47054c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:29:41.116962 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.116439 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a75aa4dc-6a56-436d-b8cb-ef570834bf3e-tmp\") pod \"klusterlet-addon-workmgr-7bc4f964d6-m5gjf\" (UID: \"a75aa4dc-6a56-436d-b8cb-ef570834bf3e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc4f964d6-m5gjf" Apr 16 16:29:41.116962 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.116467 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/160b9aaa-8178-4147-a5f7-c886dd47054c-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7c6fd87b8-mqjln\" (UID: \"160b9aaa-8178-4147-a5f7-c886dd47054c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:29:41.116962 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.116559 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" 
(UniqueName: \"kubernetes.io/configmap/160b9aaa-8178-4147-a5f7-c886dd47054c-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c6fd87b8-mqjln\" (UID: \"160b9aaa-8178-4147-a5f7-c886dd47054c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:29:41.116962 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.116915 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a75aa4dc-6a56-436d-b8cb-ef570834bf3e-tmp\") pod \"klusterlet-addon-workmgr-7bc4f964d6-m5gjf\" (UID: \"a75aa4dc-6a56-436d-b8cb-ef570834bf3e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc4f964d6-m5gjf" Apr 16 16:29:41.118889 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.118865 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/160b9aaa-8178-4147-a5f7-c886dd47054c-hub\") pod \"cluster-proxy-proxy-agent-7c6fd87b8-mqjln\" (UID: \"160b9aaa-8178-4147-a5f7-c886dd47054c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:29:41.119208 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.119170 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/160b9aaa-8178-4147-a5f7-c886dd47054c-ca\") pod \"cluster-proxy-proxy-agent-7c6fd87b8-mqjln\" (UID: \"160b9aaa-8178-4147-a5f7-c886dd47054c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:29:41.119417 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.119348 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/160b9aaa-8178-4147-a5f7-c886dd47054c-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c6fd87b8-mqjln\" (UID: \"160b9aaa-8178-4147-a5f7-c886dd47054c\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:29:41.119499 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.119484 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a75aa4dc-6a56-436d-b8cb-ef570834bf3e-klusterlet-config\") pod \"klusterlet-addon-workmgr-7bc4f964d6-m5gjf\" (UID: \"a75aa4dc-6a56-436d-b8cb-ef570834bf3e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc4f964d6-m5gjf" Apr 16 16:29:41.119574 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.119553 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/160b9aaa-8178-4147-a5f7-c886dd47054c-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7c6fd87b8-mqjln\" (UID: \"160b9aaa-8178-4147-a5f7-c886dd47054c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:29:41.122760 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.122418 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-b8wtr" Apr 16 16:29:41.124999 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.124972 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75v7d\" (UniqueName: \"kubernetes.io/projected/a75aa4dc-6a56-436d-b8cb-ef570834bf3e-kube-api-access-75v7d\") pod \"klusterlet-addon-workmgr-7bc4f964d6-m5gjf\" (UID: \"a75aa4dc-6a56-436d-b8cb-ef570834bf3e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc4f964d6-m5gjf" Apr 16 16:29:41.125239 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.125212 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh882\" (UniqueName: \"kubernetes.io/projected/160b9aaa-8178-4147-a5f7-c886dd47054c-kube-api-access-mh882\") pod \"cluster-proxy-proxy-agent-7c6fd87b8-mqjln\" (UID: \"160b9aaa-8178-4147-a5f7-c886dd47054c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:29:41.125348 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.125279 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x67lw\" (UniqueName: \"kubernetes.io/projected/fb9f58f2-5c1e-437f-8ea7-a75f424536f4-kube-api-access-x67lw\") pod \"network-check-source-7b678d77c7-zrwnz\" (UID: \"fb9f58f2-5c1e-437f-8ea7-a75f424536f4\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zrwnz" Apr 16 16:29:41.128142 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.128121 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-btkpq"] Apr 16 16:29:41.128253 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.128241 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-btkpq" Apr 16 16:29:41.130855 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.130836 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 16:29:41.130855 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.130849 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 16:29:41.131006 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.130880 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 16:29:41.131237 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.131217 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mk2ls\"" Apr 16 16:29:41.136639 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.136622 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xx6cp" Apr 16 16:29:41.165521 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.165501 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764fd9845c-g5ggz" Apr 16 16:29:41.180270 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.180249 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zrwnz" Apr 16 16:29:41.188898 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.188872 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc4f964d6-m5gjf" Apr 16 16:29:41.198501 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.198481 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:29:41.217257 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.217230 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert\") pod \"ingress-canary-btkpq\" (UID: \"8ba73826-84fd-4301-9663-d9ff7925faa6\") " pod="openshift-ingress-canary/ingress-canary-btkpq" Apr 16 16:29:41.217368 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.217278 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791-kubelet-config\") pod \"global-pull-secret-syncer-q86dl\" (UID: \"a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791\") " pod="kube-system/global-pull-secret-syncer-q86dl" Apr 16 16:29:41.217434 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.217377 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltb7r\" (UniqueName: \"kubernetes.io/projected/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-kube-api-access-ltb7r\") pod \"dns-default-wgk4m\" (UID: \"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb\") " pod="openshift-dns/dns-default-wgk4m" Apr 16 16:29:41.217485 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.217435 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791-original-pull-secret\") pod \"global-pull-secret-syncer-q86dl\" (UID: \"a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791\") 
" pod="kube-system/global-pull-secret-syncer-q86dl" Apr 16 16:29:41.217485 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.217461 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791-dbus\") pod \"global-pull-secret-syncer-q86dl\" (UID: \"a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791\") " pod="kube-system/global-pull-secret-syncer-q86dl" Apr 16 16:29:41.217567 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.217488 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-tmp-dir\") pod \"dns-default-wgk4m\" (UID: \"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb\") " pod="openshift-dns/dns-default-wgk4m" Apr 16 16:29:41.217567 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.217538 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls\") pod \"dns-default-wgk4m\" (UID: \"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb\") " pod="openshift-dns/dns-default-wgk4m" Apr 16 16:29:41.217693 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.217672 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-config-volume\") pod \"dns-default-wgk4m\" (UID: \"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb\") " pod="openshift-dns/dns-default-wgk4m" Apr 16 16:29:41.217751 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.217714 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4m79\" (UniqueName: \"kubernetes.io/projected/8ba73826-84fd-4301-9663-d9ff7925faa6-kube-api-access-j4m79\") pod 
\"ingress-canary-btkpq\" (UID: \"8ba73826-84fd-4301-9663-d9ff7925faa6\") " pod="openshift-ingress-canary/ingress-canary-btkpq" Apr 16 16:29:41.318959 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.318870 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltb7r\" (UniqueName: \"kubernetes.io/projected/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-kube-api-access-ltb7r\") pod \"dns-default-wgk4m\" (UID: \"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb\") " pod="openshift-dns/dns-default-wgk4m" Apr 16 16:29:41.319113 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.318975 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791-original-pull-secret\") pod \"global-pull-secret-syncer-q86dl\" (UID: \"a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791\") " pod="kube-system/global-pull-secret-syncer-q86dl" Apr 16 16:29:41.319113 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.319009 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791-dbus\") pod \"global-pull-secret-syncer-q86dl\" (UID: \"a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791\") " pod="kube-system/global-pull-secret-syncer-q86dl" Apr 16 16:29:41.319113 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.319044 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-tmp-dir\") pod \"dns-default-wgk4m\" (UID: \"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb\") " pod="openshift-dns/dns-default-wgk4m" Apr 16 16:29:41.319113 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.319070 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls\") pod 
\"dns-default-wgk4m\" (UID: \"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb\") " pod="openshift-dns/dns-default-wgk4m" Apr 16 16:29:41.319333 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.319114 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-config-volume\") pod \"dns-default-wgk4m\" (UID: \"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb\") " pod="openshift-dns/dns-default-wgk4m" Apr 16 16:29:41.319333 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.319140 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4m79\" (UniqueName: \"kubernetes.io/projected/8ba73826-84fd-4301-9663-d9ff7925faa6-kube-api-access-j4m79\") pod \"ingress-canary-btkpq\" (UID: \"8ba73826-84fd-4301-9663-d9ff7925faa6\") " pod="openshift-ingress-canary/ingress-canary-btkpq" Apr 16 16:29:41.319333 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.319171 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert\") pod \"ingress-canary-btkpq\" (UID: \"8ba73826-84fd-4301-9663-d9ff7925faa6\") " pod="openshift-ingress-canary/ingress-canary-btkpq" Apr 16 16:29:41.319333 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:41.319184 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:29:41.319333 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.319197 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791-kubelet-config\") pod \"global-pull-secret-syncer-q86dl\" (UID: \"a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791\") " pod="kube-system/global-pull-secret-syncer-q86dl" Apr 16 16:29:41.319333 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:41.319249 
2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls podName:b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb nodeName:}" failed. No retries permitted until 2026-04-16 16:29:41.819228001 +0000 UTC m=+34.403296728 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls") pod "dns-default-wgk4m" (UID: "b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb") : secret "dns-default-metrics-tls" not found Apr 16 16:29:41.319333 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.319286 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791-kubelet-config\") pod \"global-pull-secret-syncer-q86dl\" (UID: \"a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791\") " pod="kube-system/global-pull-secret-syncer-q86dl" Apr 16 16:29:41.319333 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.319286 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791-dbus\") pod \"global-pull-secret-syncer-q86dl\" (UID: \"a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791\") " pod="kube-system/global-pull-secret-syncer-q86dl" Apr 16 16:29:41.319766 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:41.319369 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:29:41.319766 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:41.319427 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert podName:8ba73826-84fd-4301-9663-d9ff7925faa6 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:41.819411417 +0000 UTC m=+34.403480158 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert") pod "ingress-canary-btkpq" (UID: "8ba73826-84fd-4301-9663-d9ff7925faa6") : secret "canary-serving-cert" not found Apr 16 16:29:41.319766 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.319423 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-tmp-dir\") pod \"dns-default-wgk4m\" (UID: \"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb\") " pod="openshift-dns/dns-default-wgk4m" Apr 16 16:29:41.319766 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.319668 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-config-volume\") pod \"dns-default-wgk4m\" (UID: \"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb\") " pod="openshift-dns/dns-default-wgk4m" Apr 16 16:29:41.331724 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.331696 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791-original-pull-secret\") pod \"global-pull-secret-syncer-q86dl\" (UID: \"a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791\") " pod="kube-system/global-pull-secret-syncer-q86dl" Apr 16 16:29:41.332023 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.332006 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltb7r\" (UniqueName: \"kubernetes.io/projected/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-kube-api-access-ltb7r\") pod \"dns-default-wgk4m\" (UID: \"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb\") " pod="openshift-dns/dns-default-wgk4m" Apr 16 16:29:41.332259 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.332228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4m79\" (UniqueName: 
\"kubernetes.io/projected/8ba73826-84fd-4301-9663-d9ff7925faa6-kube-api-access-j4m79\") pod \"ingress-canary-btkpq\" (UID: \"8ba73826-84fd-4301-9663-d9ff7925faa6\") " pod="openshift-ingress-canary/ingress-canary-btkpq" Apr 16 16:29:41.378732 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.378708 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q86dl" Apr 16 16:29:41.520385 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.520347 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:41.520545 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:41.520514 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:29:41.520545 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:41.520535 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65c845f5f8-7c7zn: secret "image-registry-tls" not found Apr 16 16:29:41.520626 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:41.520603 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls podName:0f60a8ab-e833-4e10-8ef6-e02666b7e615 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:42.520581462 +0000 UTC m=+35.104650209 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls") pod "image-registry-65c845f5f8-7c7zn" (UID: "0f60a8ab-e833-4e10-8ef6-e02666b7e615") : secret "image-registry-tls" not found Apr 16 16:29:41.621352 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.621270 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-s8lmq\" (UID: \"deb6ad86-785f-4724-a0f9-ecc5341ee981\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s8lmq" Apr 16 16:29:41.621514 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.621359 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs\") pod \"network-metrics-daemon-vrlk5\" (UID: \"bb3c2920-54bb-42cd-a7e2-0c22345e99b4\") " pod="openshift-multus/network-metrics-daemon-vrlk5" Apr 16 16:29:41.621514 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:41.621409 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:29:41.621514 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:41.621445 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 16:29:41.621514 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:41.621490 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert podName:deb6ad86-785f-4724-a0f9-ecc5341ee981 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:42.621472228 +0000 UTC m=+35.205540958 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-s8lmq" (UID: "deb6ad86-785f-4724-a0f9-ecc5341ee981") : secret "networking-console-plugin-cert" not found Apr 16 16:29:41.621514 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:41.621508 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs podName:bb3c2920-54bb-42cd-a7e2-0c22345e99b4 nodeName:}" failed. No retries permitted until 2026-04-16 16:30:13.621500102 +0000 UTC m=+66.205568834 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs") pod "network-metrics-daemon-vrlk5" (UID: "bb3c2920-54bb-42cd-a7e2-0c22345e99b4") : secret "metrics-daemon-secret" not found Apr 16 16:29:41.823684 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.823641 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert\") pod \"ingress-canary-btkpq\" (UID: \"8ba73826-84fd-4301-9663-d9ff7925faa6\") " pod="openshift-ingress-canary/ingress-canary-btkpq" Apr 16 16:29:41.824121 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.823707 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-df57k\" (UniqueName: \"kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k\") pod \"network-check-target-xzlm9\" (UID: \"6e5b8c51-4e68-4bf7-bb21-ec500c600441\") " pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:41.824121 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.823782 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls\") pod \"dns-default-wgk4m\" (UID: \"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb\") " pod="openshift-dns/dns-default-wgk4m" Apr 16 16:29:41.824121 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:41.823813 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:29:41.824121 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:41.823894 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert podName:8ba73826-84fd-4301-9663-d9ff7925faa6 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:42.823873366 +0000 UTC m=+35.407942099 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert") pod "ingress-canary-btkpq" (UID: "8ba73826-84fd-4301-9663-d9ff7925faa6") : secret "canary-serving-cert" not found Apr 16 16:29:41.824121 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:41.823895 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:29:41.824121 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:41.823945 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls podName:b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb nodeName:}" failed. No retries permitted until 2026-04-16 16:29:42.823929622 +0000 UTC m=+35.407998370 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls") pod "dns-default-wgk4m" (UID: "b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb") : secret "dns-default-metrics-tls" not found Apr 16 16:29:41.826393 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.826370 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-df57k\" (UniqueName: \"kubernetes.io/projected/6e5b8c51-4e68-4bf7-bb21-ec500c600441-kube-api-access-df57k\") pod \"network-check-target-xzlm9\" (UID: \"6e5b8c51-4e68-4bf7-bb21-ec500c600441\") " pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:41.900925 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:41.900889 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:42.532015 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:42.531978 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:42.532189 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:42.532143 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:29:42.532189 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:42.532167 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65c845f5f8-7c7zn: secret "image-registry-tls" not found Apr 16 16:29:42.532294 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:42.532236 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls podName:0f60a8ab-e833-4e10-8ef6-e02666b7e615 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:44.532211569 +0000 UTC m=+37.116280296 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls") pod "image-registry-65c845f5f8-7c7zn" (UID: "0f60a8ab-e833-4e10-8ef6-e02666b7e615") : secret "image-registry-tls" not found Apr 16 16:29:42.632987 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:42.632955 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-s8lmq\" (UID: \"deb6ad86-785f-4724-a0f9-ecc5341ee981\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s8lmq" Apr 16 16:29:42.633164 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:42.633142 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:29:42.633249 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:42.633238 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert podName:deb6ad86-785f-4724-a0f9-ecc5341ee981 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:44.633216785 +0000 UTC m=+37.217285526 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-s8lmq" (UID: "deb6ad86-785f-4724-a0f9-ecc5341ee981") : secret "networking-console-plugin-cert" not found Apr 16 16:29:42.834615 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:42.834328 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls\") pod \"dns-default-wgk4m\" (UID: \"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb\") " pod="openshift-dns/dns-default-wgk4m" Apr 16 16:29:42.835020 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:42.834621 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:29:42.835020 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:42.834676 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert\") pod \"ingress-canary-btkpq\" (UID: \"8ba73826-84fd-4301-9663-d9ff7925faa6\") " pod="openshift-ingress-canary/ingress-canary-btkpq" Apr 16 16:29:42.835020 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:42.834717 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls podName:b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb nodeName:}" failed. No retries permitted until 2026-04-16 16:29:44.834693398 +0000 UTC m=+37.418762134 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls") pod "dns-default-wgk4m" (UID: "b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb") : secret "dns-default-metrics-tls" not found Apr 16 16:29:42.835020 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:42.834771 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:29:42.835020 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:42.834819 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert podName:8ba73826-84fd-4301-9663-d9ff7925faa6 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:44.834804615 +0000 UTC m=+37.418873346 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert") pod "ingress-canary-btkpq" (UID: "8ba73826-84fd-4301-9663-d9ff7925faa6") : secret "canary-serving-cert" not found Apr 16 16:29:42.923965 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:42.923936 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xzlm9"] Apr 16 16:29:42.936828 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:42.936786 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764fd9845c-g5ggz"] Apr 16 16:29:42.941963 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:42.941917 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln"] Apr 16 16:29:42.948604 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:42.948583 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-xx6cp"] Apr 16 16:29:42.951627 ip-10-0-139-12 
kubenswrapper[2573]: W0416 16:29:42.951597 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e5b8c51_4e68_4bf7_bb21_ec500c600441.slice/crio-36e6fd204dc164d355dba02fcb5e68d819bfd1903a713dc5e06f87ce5e3e5d92 WatchSource:0}: Error finding container 36e6fd204dc164d355dba02fcb5e68d819bfd1903a713dc5e06f87ce5e3e5d92: Status 404 returned error can't find the container with id 36e6fd204dc164d355dba02fcb5e68d819bfd1903a713dc5e06f87ce5e3e5d92 Apr 16 16:29:42.953071 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:42.952357 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4917fbec_85cc_4edb_b09c_7f4a6a5e83f4.slice/crio-c0d34ce8f7b3ccad2c0284bb238cd1deef5af11cb73e1cb185f08ff6d84c217d WatchSource:0}: Error finding container c0d34ce8f7b3ccad2c0284bb238cd1deef5af11cb73e1cb185f08ff6d84c217d: Status 404 returned error can't find the container with id c0d34ce8f7b3ccad2c0284bb238cd1deef5af11cb73e1cb185f08ff6d84c217d Apr 16 16:29:42.954384 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:42.953765 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-q86dl"] Apr 16 16:29:42.955003 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:42.954915 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod160b9aaa_8178_4147_a5f7_c886dd47054c.slice/crio-1d62065fb2af03ed7694d37e57b4004ef0a39ad488b554cde1e9137586c3d840 WatchSource:0}: Error finding container 1d62065fb2af03ed7694d37e57b4004ef0a39ad488b554cde1e9137586c3d840: Status 404 returned error can't find the container with id 1d62065fb2af03ed7694d37e57b4004ef0a39ad488b554cde1e9137586c3d840 Apr 16 16:29:42.955193 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:42.955136 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91b28333_7ad2_40bf_b991_a75717a9ead4.slice/crio-f4353b855dc76292356e8b4049dfeafb9ce012a21fc1b34774f6c77f494f42ef WatchSource:0}: Error finding container f4353b855dc76292356e8b4049dfeafb9ce012a21fc1b34774f6c77f494f42ef: Status 404 returned error can't find the container with id f4353b855dc76292356e8b4049dfeafb9ce012a21fc1b34774f6c77f494f42ef Apr 16 16:29:42.957309 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:42.957286 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8bd6f6a_c52e_44cd_8d08_c4d8f3a1d791.slice/crio-ec2c2a834f07a3d830ad151876d5c08a655a1c88ee174600a95f8aa91ae5f87b WatchSource:0}: Error finding container ec2c2a834f07a3d830ad151876d5c08a655a1c88ee174600a95f8aa91ae5f87b: Status 404 returned error can't find the container with id ec2c2a834f07a3d830ad151876d5c08a655a1c88ee174600a95f8aa91ae5f87b Apr 16 16:29:42.962976 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:42.962955 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-zrwnz"] Apr 16 16:29:42.965846 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:42.965825 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc4f964d6-m5gjf"] Apr 16 16:29:42.966641 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:42.966594 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-b8wtr"] Apr 16 16:29:42.975690 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:42.975436 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb9f58f2_5c1e_437f_8ea7_a75f424536f4.slice/crio-7170a95d94968cf65f4c36bd887938c0dfdad8fba2ab84876bf7bd4fb76c2f6a WatchSource:0}: Error 
finding container 7170a95d94968cf65f4c36bd887938c0dfdad8fba2ab84876bf7bd4fb76c2f6a: Status 404 returned error can't find the container with id 7170a95d94968cf65f4c36bd887938c0dfdad8fba2ab84876bf7bd4fb76c2f6a Apr 16 16:29:42.976260 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:42.976014 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda75aa4dc_6a56_436d_b8cb_ef570834bf3e.slice/crio-a0f7ededea99cc5d322bff63d15a82e04dc1f118a5d563201718f476926ff79b WatchSource:0}: Error finding container a0f7ededea99cc5d322bff63d15a82e04dc1f118a5d563201718f476926ff79b: Status 404 returned error can't find the container with id a0f7ededea99cc5d322bff63d15a82e04dc1f118a5d563201718f476926ff79b Apr 16 16:29:42.977210 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:42.977088 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38423965_6637_4f51_89af_e465c2846faf.slice/crio-dc7c071c7520d518ec9a2f259d8eabffdb07e9f29d183db2c8553aff6b210906 WatchSource:0}: Error finding container dc7c071c7520d518ec9a2f259d8eabffdb07e9f29d183db2c8553aff6b210906: Status 404 returned error can't find the container with id dc7c071c7520d518ec9a2f259d8eabffdb07e9f29d183db2c8553aff6b210906 Apr 16 16:29:43.096237 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:43.096195 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764fd9845c-g5ggz" event={"ID":"4917fbec-85cc-4edb-b09c-7f4a6a5e83f4","Type":"ContainerStarted","Data":"c0d34ce8f7b3ccad2c0284bb238cd1deef5af11cb73e1cb185f08ff6d84c217d"} Apr 16 16:29:43.097190 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:43.097168 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc4f964d6-m5gjf" 
event={"ID":"a75aa4dc-6a56-436d-b8cb-ef570834bf3e","Type":"ContainerStarted","Data":"a0f7ededea99cc5d322bff63d15a82e04dc1f118a5d563201718f476926ff79b"} Apr 16 16:29:43.098154 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:43.098134 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zrwnz" event={"ID":"fb9f58f2-5c1e-437f-8ea7-a75f424536f4","Type":"ContainerStarted","Data":"7170a95d94968cf65f4c36bd887938c0dfdad8fba2ab84876bf7bd4fb76c2f6a"} Apr 16 16:29:43.099010 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:43.098992 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" event={"ID":"160b9aaa-8178-4147-a5f7-c886dd47054c","Type":"ContainerStarted","Data":"1d62065fb2af03ed7694d37e57b4004ef0a39ad488b554cde1e9137586c3d840"} Apr 16 16:29:43.099998 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:43.099975 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-b8wtr" event={"ID":"38423965-6637-4f51-89af-e465c2846faf","Type":"ContainerStarted","Data":"dc7c071c7520d518ec9a2f259d8eabffdb07e9f29d183db2c8553aff6b210906"} Apr 16 16:29:43.104670 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:43.104626 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xx6cp" event={"ID":"91b28333-7ad2-40bf-b991-a75717a9ead4","Type":"ContainerStarted","Data":"f4353b855dc76292356e8b4049dfeafb9ce012a21fc1b34774f6c77f494f42ef"} Apr 16 16:29:43.105629 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:43.105608 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-q86dl" event={"ID":"a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791","Type":"ContainerStarted","Data":"ec2c2a834f07a3d830ad151876d5c08a655a1c88ee174600a95f8aa91ae5f87b"} Apr 16 
16:29:43.106502 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:43.106484 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xzlm9" event={"ID":"6e5b8c51-4e68-4bf7-bb21-ec500c600441","Type":"ContainerStarted","Data":"36e6fd204dc164d355dba02fcb5e68d819bfd1903a713dc5e06f87ce5e3e5d92"} Apr 16 16:29:44.124852 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:44.123817 2573 generic.go:358] "Generic (PLEG): container finished" podID="34f77f7d-b36d-4041-8328-4e6971055b19" containerID="6cda05365c5e9dc235317bf85b33a65ee58f04186ef016e457a06afccbc8d1b6" exitCode=0 Apr 16 16:29:44.124852 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:44.123901 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7vvmn" event={"ID":"34f77f7d-b36d-4041-8328-4e6971055b19","Type":"ContainerDied","Data":"6cda05365c5e9dc235317bf85b33a65ee58f04186ef016e457a06afccbc8d1b6"} Apr 16 16:29:44.555422 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:44.554573 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:44.555422 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:44.555012 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:29:44.555422 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:44.555029 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65c845f5f8-7c7zn: secret "image-registry-tls" not found Apr 16 16:29:44.555422 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:44.555083 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls podName:0f60a8ab-e833-4e10-8ef6-e02666b7e615 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:48.55506484 +0000 UTC m=+41.139133582 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls") pod "image-registry-65c845f5f8-7c7zn" (UID: "0f60a8ab-e833-4e10-8ef6-e02666b7e615") : secret "image-registry-tls" not found Apr 16 16:29:44.656485 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:44.655792 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-s8lmq\" (UID: \"deb6ad86-785f-4724-a0f9-ecc5341ee981\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s8lmq" Apr 16 16:29:44.656485 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:44.655976 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:29:44.656485 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:44.656050 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert podName:deb6ad86-785f-4724-a0f9-ecc5341ee981 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:48.656030516 +0000 UTC m=+41.240099258 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-s8lmq" (UID: "deb6ad86-785f-4724-a0f9-ecc5341ee981") : secret "networking-console-plugin-cert" not found Apr 16 16:29:44.858624 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:44.857802 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls\") pod \"dns-default-wgk4m\" (UID: \"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb\") " pod="openshift-dns/dns-default-wgk4m" Apr 16 16:29:44.858624 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:44.857877 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert\") pod \"ingress-canary-btkpq\" (UID: \"8ba73826-84fd-4301-9663-d9ff7925faa6\") " pod="openshift-ingress-canary/ingress-canary-btkpq" Apr 16 16:29:44.858624 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:44.858071 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:29:44.858624 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:44.858129 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert podName:8ba73826-84fd-4301-9663-d9ff7925faa6 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:48.858111376 +0000 UTC m=+41.442180109 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert") pod "ingress-canary-btkpq" (UID: "8ba73826-84fd-4301-9663-d9ff7925faa6") : secret "canary-serving-cert" not found Apr 16 16:29:44.858624 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:44.858539 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:29:44.858624 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:44.858586 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls podName:b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb nodeName:}" failed. No retries permitted until 2026-04-16 16:29:48.8585718 +0000 UTC m=+41.442640532 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls") pod "dns-default-wgk4m" (UID: "b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb") : secret "dns-default-metrics-tls" not found Apr 16 16:29:45.172586 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:45.171688 2573 generic.go:358] "Generic (PLEG): container finished" podID="34f77f7d-b36d-4041-8328-4e6971055b19" containerID="8a050097d42c027d05e264c99fcd72fd3e3a7c031ac220bc4fb51be0a1161db5" exitCode=0 Apr 16 16:29:45.172586 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:45.171762 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7vvmn" event={"ID":"34f77f7d-b36d-4041-8328-4e6971055b19","Type":"ContainerDied","Data":"8a050097d42c027d05e264c99fcd72fd3e3a7c031ac220bc4fb51be0a1161db5"} Apr 16 16:29:46.192677 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:46.191862 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7vvmn" 
event={"ID":"34f77f7d-b36d-4041-8328-4e6971055b19","Type":"ContainerStarted","Data":"57ab70f36c19b0a7755d41c6aa3f19e12565d3a8b866b6d3929780c5772d5535"} Apr 16 16:29:46.216336 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:46.215545 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7vvmn" podStartSLOduration=5.907917606 podStartE2EDuration="38.215528568s" podCreationTimestamp="2026-04-16 16:29:08 +0000 UTC" firstStartedPulling="2026-04-16 16:29:10.693906929 +0000 UTC m=+3.277975659" lastFinishedPulling="2026-04-16 16:29:43.001517883 +0000 UTC m=+35.585586621" observedRunningTime="2026-04-16 16:29:46.213729239 +0000 UTC m=+38.797797989" watchObservedRunningTime="2026-04-16 16:29:46.215528568 +0000 UTC m=+38.799597317" Apr 16 16:29:48.594547 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:48.594506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:29:48.594942 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:48.594670 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:29:48.594942 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:48.594695 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65c845f5f8-7c7zn: secret "image-registry-tls" not found Apr 16 16:29:48.594942 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:48.594762 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls podName:0f60a8ab-e833-4e10-8ef6-e02666b7e615 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:29:56.59474253 +0000 UTC m=+49.178811270 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls") pod "image-registry-65c845f5f8-7c7zn" (UID: "0f60a8ab-e833-4e10-8ef6-e02666b7e615") : secret "image-registry-tls" not found Apr 16 16:29:48.695768 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:48.695735 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-s8lmq\" (UID: \"deb6ad86-785f-4724-a0f9-ecc5341ee981\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s8lmq" Apr 16 16:29:48.695925 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:48.695857 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:29:48.695925 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:48.695917 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert podName:deb6ad86-785f-4724-a0f9-ecc5341ee981 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:56.695903103 +0000 UTC m=+49.279971830 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-s8lmq" (UID: "deb6ad86-785f-4724-a0f9-ecc5341ee981") : secret "networking-console-plugin-cert" not found Apr 16 16:29:48.896870 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:48.896828 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls\") pod \"dns-default-wgk4m\" (UID: \"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb\") " pod="openshift-dns/dns-default-wgk4m" Apr 16 16:29:48.897038 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:48.896895 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert\") pod \"ingress-canary-btkpq\" (UID: \"8ba73826-84fd-4301-9663-d9ff7925faa6\") " pod="openshift-ingress-canary/ingress-canary-btkpq" Apr 16 16:29:48.897038 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:48.896996 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:29:48.897162 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:48.897082 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls podName:b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb nodeName:}" failed. No retries permitted until 2026-04-16 16:29:56.897059225 +0000 UTC m=+49.481127975 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls") pod "dns-default-wgk4m" (UID: "b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb") : secret "dns-default-metrics-tls" not found Apr 16 16:29:48.897162 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:48.897089 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:29:48.897162 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:48.897144 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert podName:8ba73826-84fd-4301-9663-d9ff7925faa6 nodeName:}" failed. No retries permitted until 2026-04-16 16:29:56.897126699 +0000 UTC m=+49.481195427 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert") pod "ingress-canary-btkpq" (UID: "8ba73826-84fd-4301-9663-d9ff7925faa6") : secret "canary-serving-cert" not found Apr 16 16:29:56.217966 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.217927 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xzlm9" event={"ID":"6e5b8c51-4e68-4bf7-bb21-ec500c600441","Type":"ContainerStarted","Data":"8adc55aa79f7c9bd6bb3edb6616833afba14c358608b76473113bd5da38b9bc2"} Apr 16 16:29:56.218481 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.218056 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:29:56.219469 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.219432 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764fd9845c-g5ggz" 
event={"ID":"4917fbec-85cc-4edb-b09c-7f4a6a5e83f4","Type":"ContainerStarted","Data":"9cb6947c1e1c15e83724d05c950c040d95e75db0793e65f1ce9a2d33568e81e4"} Apr 16 16:29:56.220754 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.220736 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc4f964d6-m5gjf" event={"ID":"a75aa4dc-6a56-436d-b8cb-ef570834bf3e","Type":"ContainerStarted","Data":"107d1631fae3c46d0d02c70779b360f36d1f6eb1a1cd5f2ff86298ed64457a1c"} Apr 16 16:29:56.220966 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.220938 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc4f964d6-m5gjf" Apr 16 16:29:56.222333 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.222297 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zrwnz" event={"ID":"fb9f58f2-5c1e-437f-8ea7-a75f424536f4","Type":"ContainerStarted","Data":"17be944198915bf98bb417052584ee6d24a287ac431861ec314f5439c82b839b"} Apr 16 16:29:56.222756 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.222737 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc4f964d6-m5gjf" Apr 16 16:29:56.223695 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.223677 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" event={"ID":"160b9aaa-8178-4147-a5f7-c886dd47054c","Type":"ContainerStarted","Data":"e8fd533c14436abfe06985edf79c5dbcbc21210a771918e6c20cba41c1ae0adf"} Apr 16 16:29:56.224937 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.224918 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-b8wtr" 
event={"ID":"38423965-6637-4f51-89af-e465c2846faf","Type":"ContainerStarted","Data":"7ecc84381aab7f6f9e15cf79f95672e1ba081c5a6598d463a00a75201b03df04"} Apr 16 16:29:56.226221 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.226201 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xx6cp" event={"ID":"91b28333-7ad2-40bf-b991-a75717a9ead4","Type":"ContainerStarted","Data":"1ed5f91797e7a918d7a3bb77baf9aa9545d290bcafbb19948843345cfc7bcfee"} Apr 16 16:29:56.227471 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.227452 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-q86dl" event={"ID":"a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791","Type":"ContainerStarted","Data":"0d493ba14be37b6b7bdd11040277313ce79ca1835cb9b40b4c0836fe9f7a5888"} Apr 16 16:29:56.234104 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.234068 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-xzlm9" podStartSLOduration=35.945295592 podStartE2EDuration="48.234055484s" podCreationTimestamp="2026-04-16 16:29:08 +0000 UTC" firstStartedPulling="2026-04-16 16:29:42.975098883 +0000 UTC m=+35.559167612" lastFinishedPulling="2026-04-16 16:29:55.263858763 +0000 UTC m=+47.847927504" observedRunningTime="2026-04-16 16:29:56.233071234 +0000 UTC m=+48.817139996" watchObservedRunningTime="2026-04-16 16:29:56.234055484 +0000 UTC m=+48.818124233" Apr 16 16:29:56.250611 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.250571 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zrwnz" podStartSLOduration=33.961677462 podStartE2EDuration="46.250561636s" podCreationTimestamp="2026-04-16 16:29:10 +0000 UTC" firstStartedPulling="2026-04-16 16:29:42.977160092 +0000 UTC m=+35.561228832" lastFinishedPulling="2026-04-16 16:29:55.266044265 +0000 UTC 
m=+47.850113006" observedRunningTime="2026-04-16 16:29:56.249878194 +0000 UTC m=+48.833946944" watchObservedRunningTime="2026-04-16 16:29:56.250561636 +0000 UTC m=+48.834630387" Apr 16 16:29:56.267486 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.267444 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-b8wtr" podStartSLOduration=36.326592295 podStartE2EDuration="48.267430584s" podCreationTimestamp="2026-04-16 16:29:08 +0000 UTC" firstStartedPulling="2026-04-16 16:29:42.988281437 +0000 UTC m=+35.572350164" lastFinishedPulling="2026-04-16 16:29:54.929119708 +0000 UTC m=+47.513188453" observedRunningTime="2026-04-16 16:29:56.266991212 +0000 UTC m=+48.851059974" watchObservedRunningTime="2026-04-16 16:29:56.267430584 +0000 UTC m=+48.851499334" Apr 16 16:29:56.296701 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.296642 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xx6cp" podStartSLOduration=34.007425236 podStartE2EDuration="46.296631313s" podCreationTimestamp="2026-04-16 16:29:10 +0000 UTC" firstStartedPulling="2026-04-16 16:29:42.974807004 +0000 UTC m=+35.558875731" lastFinishedPulling="2026-04-16 16:29:55.264013066 +0000 UTC m=+47.848081808" observedRunningTime="2026-04-16 16:29:56.296181991 +0000 UTC m=+48.880250741" watchObservedRunningTime="2026-04-16 16:29:56.296631313 +0000 UTC m=+48.880700040" Apr 16 16:29:56.296832 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.296728 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bc4f964d6-m5gjf" podStartSLOduration=18.036871063 podStartE2EDuration="30.29672371s" podCreationTimestamp="2026-04-16 16:29:26 +0000 UTC" firstStartedPulling="2026-04-16 16:29:42.977628869 +0000 UTC m=+35.561697601" 
lastFinishedPulling="2026-04-16 16:29:55.237481514 +0000 UTC m=+47.821550248" observedRunningTime="2026-04-16 16:29:56.281617585 +0000 UTC m=+48.865686336" watchObservedRunningTime="2026-04-16 16:29:56.29672371 +0000 UTC m=+48.880792459" Apr 16 16:29:56.310791 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.310747 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-q86dl" podStartSLOduration=3.021717717 podStartE2EDuration="15.310733185s" podCreationTimestamp="2026-04-16 16:29:41 +0000 UTC" firstStartedPulling="2026-04-16 16:29:42.97507953 +0000 UTC m=+35.559148257" lastFinishedPulling="2026-04-16 16:29:55.264094982 +0000 UTC m=+47.848163725" observedRunningTime="2026-04-16 16:29:56.310192621 +0000 UTC m=+48.894261370" watchObservedRunningTime="2026-04-16 16:29:56.310733185 +0000 UTC m=+48.894801935" Apr 16 16:29:56.326179 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.326135 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764fd9845c-g5ggz" podStartSLOduration=18.063458579 podStartE2EDuration="30.326120719s" podCreationTimestamp="2026-04-16 16:29:26 +0000 UTC" firstStartedPulling="2026-04-16 16:29:42.974851934 +0000 UTC m=+35.558920661" lastFinishedPulling="2026-04-16 16:29:55.237514062 +0000 UTC m=+47.821582801" observedRunningTime="2026-04-16 16:29:56.324894393 +0000 UTC m=+48.908963143" watchObservedRunningTime="2026-04-16 16:29:56.326120719 +0000 UTC m=+48.910189469" Apr 16 16:29:56.660730 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.660692 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 
16:29:56.660902 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:56.660843 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:29:56.660902 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:56.660863 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65c845f5f8-7c7zn: secret "image-registry-tls" not found Apr 16 16:29:56.660988 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:56.660920 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls podName:0f60a8ab-e833-4e10-8ef6-e02666b7e615 nodeName:}" failed. No retries permitted until 2026-04-16 16:30:12.660902185 +0000 UTC m=+65.244970917 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls") pod "image-registry-65c845f5f8-7c7zn" (UID: "0f60a8ab-e833-4e10-8ef6-e02666b7e615") : secret "image-registry-tls" not found Apr 16 16:29:56.761753 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.761716 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-s8lmq\" (UID: \"deb6ad86-785f-4724-a0f9-ecc5341ee981\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s8lmq" Apr 16 16:29:56.761908 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:56.761835 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:29:56.761908 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:56.761889 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert podName:deb6ad86-785f-4724-a0f9-ecc5341ee981 nodeName:}" failed. No retries permitted until 2026-04-16 16:30:12.761874831 +0000 UTC m=+65.345943557 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-s8lmq" (UID: "deb6ad86-785f-4724-a0f9-ecc5341ee981") : secret "networking-console-plugin-cert" not found Apr 16 16:29:56.964368 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.964287 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls\") pod \"dns-default-wgk4m\" (UID: \"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb\") " pod="openshift-dns/dns-default-wgk4m" Apr 16 16:29:56.964368 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.964342 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert\") pod \"ingress-canary-btkpq\" (UID: \"8ba73826-84fd-4301-9663-d9ff7925faa6\") " pod="openshift-ingress-canary/ingress-canary-btkpq" Apr 16 16:29:56.964591 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:56.964453 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:29:56.964591 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:56.964510 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert podName:8ba73826-84fd-4301-9663-d9ff7925faa6 nodeName:}" failed. No retries permitted until 2026-04-16 16:30:12.9644963 +0000 UTC m=+65.548565027 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert") pod "ingress-canary-btkpq" (UID: "8ba73826-84fd-4301-9663-d9ff7925faa6") : secret "canary-serving-cert" not found Apr 16 16:29:56.964591 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:56.964453 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:29:56.964591 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:29:56.964578 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls podName:b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb nodeName:}" failed. No retries permitted until 2026-04-16 16:30:12.964566441 +0000 UTC m=+65.548635183 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls") pod "dns-default-wgk4m" (UID: "b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb") : secret "dns-default-metrics-tls" not found Apr 16 16:29:56.974488 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.974460 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-78npn"] Apr 16 16:29:56.990127 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.990094 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-78npn"] Apr 16 16:29:56.990283 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.990267 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-78npn" Apr 16 16:29:56.993100 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.993073 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-nlxnt\"" Apr 16 16:29:56.994200 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.994177 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 16:29:56.994539 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:56.994516 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 16:29:57.166336 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:57.166296 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbnwb\" (UniqueName: \"kubernetes.io/projected/575d205c-5915-427a-881e-800cf69e2195-kube-api-access-pbnwb\") pod \"migrator-64d4d94569-78npn\" (UID: \"575d205c-5915-427a-881e-800cf69e2195\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-78npn" Apr 16 16:29:57.267629 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:57.267556 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbnwb\" (UniqueName: \"kubernetes.io/projected/575d205c-5915-427a-881e-800cf69e2195-kube-api-access-pbnwb\") pod \"migrator-64d4d94569-78npn\" (UID: \"575d205c-5915-427a-881e-800cf69e2195\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-78npn" Apr 16 16:29:57.285693 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:57.285646 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbnwb\" (UniqueName: 
\"kubernetes.io/projected/575d205c-5915-427a-881e-800cf69e2195-kube-api-access-pbnwb\") pod \"migrator-64d4d94569-78npn\" (UID: \"575d205c-5915-427a-881e-800cf69e2195\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-78npn" Apr 16 16:29:57.313867 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:57.313844 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-78npn" Apr 16 16:29:57.446110 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:57.446082 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-78npn"] Apr 16 16:29:57.448641 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:57.448613 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod575d205c_5915_427a_881e_800cf69e2195.slice/crio-119159f07e1b8c791947df867aab318ed2262823330aa661de49a7b63a6b8cec WatchSource:0}: Error finding container 119159f07e1b8c791947df867aab318ed2262823330aa661de49a7b63a6b8cec: Status 404 returned error can't find the container with id 119159f07e1b8c791947df867aab318ed2262823330aa661de49a7b63a6b8cec Apr 16 16:29:58.235461 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:58.235418 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-78npn" event={"ID":"575d205c-5915-427a-881e-800cf69e2195","Type":"ContainerStarted","Data":"119159f07e1b8c791947df867aab318ed2262823330aa661de49a7b63a6b8cec"} Apr 16 16:29:59.016476 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.016437 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-kt5wl"] Apr 16 16:29:59.019833 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.019811 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-kt5wl" Apr 16 16:29:59.022622 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.022588 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 16:29:59.022622 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.022588 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 16:29:59.022823 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.022645 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 16:29:59.023857 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.023821 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-f2f4c\"" Apr 16 16:29:59.024072 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.024050 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 16:29:59.028757 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.027958 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-kt5wl"] Apr 16 16:29:59.082956 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.082922 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/db82f70c-e6a4-48b4-97b1-6a5ee9a7f735-signing-key\") pod \"service-ca-bfc587fb7-kt5wl\" (UID: \"db82f70c-e6a4-48b4-97b1-6a5ee9a7f735\") " pod="openshift-service-ca/service-ca-bfc587fb7-kt5wl" Apr 16 16:29:59.083102 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.082961 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/db82f70c-e6a4-48b4-97b1-6a5ee9a7f735-signing-cabundle\") pod \"service-ca-bfc587fb7-kt5wl\" (UID: \"db82f70c-e6a4-48b4-97b1-6a5ee9a7f735\") " pod="openshift-service-ca/service-ca-bfc587fb7-kt5wl" Apr 16 16:29:59.083102 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.083037 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz69f\" (UniqueName: \"kubernetes.io/projected/db82f70c-e6a4-48b4-97b1-6a5ee9a7f735-kube-api-access-vz69f\") pod \"service-ca-bfc587fb7-kt5wl\" (UID: \"db82f70c-e6a4-48b4-97b1-6a5ee9a7f735\") " pod="openshift-service-ca/service-ca-bfc587fb7-kt5wl" Apr 16 16:29:59.183468 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.183429 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/db82f70c-e6a4-48b4-97b1-6a5ee9a7f735-signing-key\") pod \"service-ca-bfc587fb7-kt5wl\" (UID: \"db82f70c-e6a4-48b4-97b1-6a5ee9a7f735\") " pod="openshift-service-ca/service-ca-bfc587fb7-kt5wl" Apr 16 16:29:59.183468 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.183466 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/db82f70c-e6a4-48b4-97b1-6a5ee9a7f735-signing-cabundle\") pod \"service-ca-bfc587fb7-kt5wl\" (UID: \"db82f70c-e6a4-48b4-97b1-6a5ee9a7f735\") " pod="openshift-service-ca/service-ca-bfc587fb7-kt5wl" Apr 16 16:29:59.183723 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.183507 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vz69f\" (UniqueName: \"kubernetes.io/projected/db82f70c-e6a4-48b4-97b1-6a5ee9a7f735-kube-api-access-vz69f\") pod \"service-ca-bfc587fb7-kt5wl\" (UID: \"db82f70c-e6a4-48b4-97b1-6a5ee9a7f735\") " pod="openshift-service-ca/service-ca-bfc587fb7-kt5wl" Apr 16 16:29:59.184378 ip-10-0-139-12 kubenswrapper[2573]: I0416 
16:29:59.184354 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/db82f70c-e6a4-48b4-97b1-6a5ee9a7f735-signing-cabundle\") pod \"service-ca-bfc587fb7-kt5wl\" (UID: \"db82f70c-e6a4-48b4-97b1-6a5ee9a7f735\") " pod="openshift-service-ca/service-ca-bfc587fb7-kt5wl" Apr 16 16:29:59.186179 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.186150 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/db82f70c-e6a4-48b4-97b1-6a5ee9a7f735-signing-key\") pod \"service-ca-bfc587fb7-kt5wl\" (UID: \"db82f70c-e6a4-48b4-97b1-6a5ee9a7f735\") " pod="openshift-service-ca/service-ca-bfc587fb7-kt5wl" Apr 16 16:29:59.191624 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.191605 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz69f\" (UniqueName: \"kubernetes.io/projected/db82f70c-e6a4-48b4-97b1-6a5ee9a7f735-kube-api-access-vz69f\") pod \"service-ca-bfc587fb7-kt5wl\" (UID: \"db82f70c-e6a4-48b4-97b1-6a5ee9a7f735\") " pod="openshift-service-ca/service-ca-bfc587fb7-kt5wl" Apr 16 16:29:59.239900 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.239865 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-78npn" event={"ID":"575d205c-5915-427a-881e-800cf69e2195","Type":"ContainerStarted","Data":"600233fead776b6368dc31c2dcf8cd58553c3877f45772805b12d125d96637d5"} Apr 16 16:29:59.240040 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.239909 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-78npn" event={"ID":"575d205c-5915-427a-881e-800cf69e2195","Type":"ContainerStarted","Data":"0838f4c39ebc5a7b9903278fbb1457c5ea29200d41e55c0264b225b1167169a7"} Apr 16 16:29:59.257442 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.257398 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-78npn" podStartSLOduration=2.14363873 podStartE2EDuration="3.257384957s" podCreationTimestamp="2026-04-16 16:29:56 +0000 UTC" firstStartedPulling="2026-04-16 16:29:57.450571884 +0000 UTC m=+50.034640616" lastFinishedPulling="2026-04-16 16:29:58.564318113 +0000 UTC m=+51.148386843" observedRunningTime="2026-04-16 16:29:59.256421917 +0000 UTC m=+51.840490677" watchObservedRunningTime="2026-04-16 16:29:59.257384957 +0000 UTC m=+51.841453706" Apr 16 16:29:59.334224 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.334132 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-kt5wl" Apr 16 16:29:59.484868 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:29:59.484845 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-kt5wl"] Apr 16 16:29:59.496786 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:29:59.496758 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb82f70c_e6a4_48b4_97b1_6a5ee9a7f735.slice/crio-e2ef6a1a7f245dd2a5876c23b22046c62a91e9d50dd1edd82bd2e8fa5b10f2ac WatchSource:0}: Error finding container e2ef6a1a7f245dd2a5876c23b22046c62a91e9d50dd1edd82bd2e8fa5b10f2ac: Status 404 returned error can't find the container with id e2ef6a1a7f245dd2a5876c23b22046c62a91e9d50dd1edd82bd2e8fa5b10f2ac Apr 16 16:30:00.244000 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:00.243968 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-kt5wl" event={"ID":"db82f70c-e6a4-48b4-97b1-6a5ee9a7f735","Type":"ContainerStarted","Data":"843ed95d0c85c6bd3e822e635554921bdb2f88939d1cbcc67a67e26722f29edb"} Apr 16 16:30:00.244000 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:00.244006 2573 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-kt5wl" event={"ID":"db82f70c-e6a4-48b4-97b1-6a5ee9a7f735","Type":"ContainerStarted","Data":"e2ef6a1a7f245dd2a5876c23b22046c62a91e9d50dd1edd82bd2e8fa5b10f2ac"} Apr 16 16:30:00.245784 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:00.245755 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" event={"ID":"160b9aaa-8178-4147-a5f7-c886dd47054c","Type":"ContainerStarted","Data":"57907be2db0e926b09f9969dfe7b00e493538bc3482ccc703ce6cbc40b8e5378"} Apr 16 16:30:00.245919 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:00.245791 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" event={"ID":"160b9aaa-8178-4147-a5f7-c886dd47054c","Type":"ContainerStarted","Data":"0d9cce8cef3ea706ad8ed62ae58db9acf9377048a369b90323ed5b3c5c36c5eb"} Apr 16 16:30:00.259108 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:00.259068 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-kt5wl" podStartSLOduration=2.259056318 podStartE2EDuration="2.259056318s" podCreationTimestamp="2026-04-16 16:29:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:30:00.258787369 +0000 UTC m=+52.842856118" watchObservedRunningTime="2026-04-16 16:30:00.259056318 +0000 UTC m=+52.843125066" Apr 16 16:30:00.277290 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:00.277254 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" podStartSLOduration=17.820366173 podStartE2EDuration="34.277242918s" podCreationTimestamp="2026-04-16 16:29:26 +0000 UTC" firstStartedPulling="2026-04-16 16:29:42.975014058 +0000 UTC m=+35.559082798" 
lastFinishedPulling="2026-04-16 16:29:59.431890816 +0000 UTC m=+52.015959543" observedRunningTime="2026-04-16 16:30:00.276265488 +0000 UTC m=+52.860334242" watchObservedRunningTime="2026-04-16 16:30:00.277242918 +0000 UTC m=+52.861311732" Apr 16 16:30:07.091345 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:07.091317 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sg9b4" Apr 16 16:30:11.200070 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:11.200018 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" podUID="160b9aaa-8178-4147-a5f7-c886dd47054c" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 16:30:12.693633 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:12.693592 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:30:12.695957 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:12.695937 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls\") pod \"image-registry-65c845f5f8-7c7zn\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:30:12.794905 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:12.794870 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert\") pod 
\"networking-console-plugin-5cb6cf4cb4-s8lmq\" (UID: \"deb6ad86-785f-4724-a0f9-ecc5341ee981\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s8lmq" Apr 16 16:30:12.797086 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:12.797067 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/deb6ad86-785f-4724-a0f9-ecc5341ee981-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-s8lmq\" (UID: \"deb6ad86-785f-4724-a0f9-ecc5341ee981\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s8lmq" Apr 16 16:30:12.908247 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:12.908210 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fdzgl\"" Apr 16 16:30:12.916376 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:12.916359 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:30:12.978515 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:12.978489 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-fczp5\"" Apr 16 16:30:12.987373 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:12.986305 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s8lmq" Apr 16 16:30:12.996528 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:12.996495 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls\") pod \"dns-default-wgk4m\" (UID: \"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb\") " pod="openshift-dns/dns-default-wgk4m" Apr 16 16:30:12.996644 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:12.996561 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert\") pod \"ingress-canary-btkpq\" (UID: \"8ba73826-84fd-4301-9663-d9ff7925faa6\") " pod="openshift-ingress-canary/ingress-canary-btkpq" Apr 16 16:30:12.999319 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:12.999293 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb-metrics-tls\") pod \"dns-default-wgk4m\" (UID: \"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb\") " pod="openshift-dns/dns-default-wgk4m" Apr 16 16:30:12.999431 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:12.999377 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba73826-84fd-4301-9663-d9ff7925faa6-cert\") pod \"ingress-canary-btkpq\" (UID: \"8ba73826-84fd-4301-9663-d9ff7925faa6\") " pod="openshift-ingress-canary/ingress-canary-btkpq" Apr 16 16:30:13.036107 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:13.036080 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-65c845f5f8-7c7zn"] Apr 16 16:30:13.039725 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:30:13.039691 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f60a8ab_e833_4e10_8ef6_e02666b7e615.slice/crio-b709d2a2681ad556a4b161366e36af5096f5499643e57680488dd113e629eb54 WatchSource:0}: Error finding container b709d2a2681ad556a4b161366e36af5096f5499643e57680488dd113e629eb54: Status 404 returned error can't find the container with id b709d2a2681ad556a4b161366e36af5096f5499643e57680488dd113e629eb54 Apr 16 16:30:13.102526 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:13.102499 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-s8lmq"] Apr 16 16:30:13.105284 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:30:13.105257 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeb6ad86_785f_4724_a0f9_ecc5341ee981.slice/crio-c8b39bfd8753933edd8fd72ba3dba0efe6f881eee086a8e469b332dac3574319 WatchSource:0}: Error finding container c8b39bfd8753933edd8fd72ba3dba0efe6f881eee086a8e469b332dac3574319: Status 404 returned error can't find the container with id c8b39bfd8753933edd8fd72ba3dba0efe6f881eee086a8e469b332dac3574319 Apr 16 16:30:13.199668 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:13.199623 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8mbbv\"" Apr 16 16:30:13.208522 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:13.208506 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wgk4m" Apr 16 16:30:13.240419 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:13.240391 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mk2ls\"" Apr 16 16:30:13.248352 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:13.248328 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-btkpq" Apr 16 16:30:13.280562 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:13.280516 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s8lmq" event={"ID":"deb6ad86-785f-4724-a0f9-ecc5341ee981","Type":"ContainerStarted","Data":"c8b39bfd8753933edd8fd72ba3dba0efe6f881eee086a8e469b332dac3574319"} Apr 16 16:30:13.281985 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:13.281939 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" event={"ID":"0f60a8ab-e833-4e10-8ef6-e02666b7e615","Type":"ContainerStarted","Data":"e747aca1d6a04b4f44b37266fc3aa0cab2395a3f3a2514a8ffb3ebbcf4dc6a11"} Apr 16 16:30:13.281985 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:13.281968 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" event={"ID":"0f60a8ab-e833-4e10-8ef6-e02666b7e615","Type":"ContainerStarted","Data":"b709d2a2681ad556a4b161366e36af5096f5499643e57680488dd113e629eb54"} Apr 16 16:30:13.282150 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:13.282133 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:30:13.302529 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:13.302283 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" podStartSLOduration=65.302264442 podStartE2EDuration="1m5.302264442s" podCreationTimestamp="2026-04-16 16:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:30:13.300491664 +0000 UTC m=+65.884560426" watchObservedRunningTime="2026-04-16 16:30:13.302264442 +0000 UTC m=+65.886333194" Apr 16 16:30:13.340029 
ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:13.339875 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wgk4m"]
Apr 16 16:30:13.342598 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:30:13.342566 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2e4f2b4_f3f9_4e12_b40a_1ab2d4aefaeb.slice/crio-40147d6d72bede795cb24caee8e7aab844fbba647c6cc529bdd35dde23ed0b05 WatchSource:0}: Error finding container 40147d6d72bede795cb24caee8e7aab844fbba647c6cc529bdd35dde23ed0b05: Status 404 returned error can't find the container with id 40147d6d72bede795cb24caee8e7aab844fbba647c6cc529bdd35dde23ed0b05
Apr 16 16:30:13.381493 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:13.381331 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-btkpq"]
Apr 16 16:30:13.383772 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:30:13.383748 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ba73826_84fd_4301_9663_d9ff7925faa6.slice/crio-747bc6809ff2564f8945fe53d1d2653798ae98cd647e261f645dd9845178e311 WatchSource:0}: Error finding container 747bc6809ff2564f8945fe53d1d2653798ae98cd647e261f645dd9845178e311: Status 404 returned error can't find the container with id 747bc6809ff2564f8945fe53d1d2653798ae98cd647e261f645dd9845178e311
Apr 16 16:30:13.702466 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:13.702431 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs\") pod \"network-metrics-daemon-vrlk5\" (UID: \"bb3c2920-54bb-42cd-a7e2-0c22345e99b4\") " pod="openshift-multus/network-metrics-daemon-vrlk5"
Apr 16 16:30:13.704578 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:13.704559 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb3c2920-54bb-42cd-a7e2-0c22345e99b4-metrics-certs\") pod \"network-metrics-daemon-vrlk5\" (UID: \"bb3c2920-54bb-42cd-a7e2-0c22345e99b4\") " pod="openshift-multus/network-metrics-daemon-vrlk5"
Apr 16 16:30:13.997796 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:13.997714 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tqp8k\""
Apr 16 16:30:14.005972 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:14.005948 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrlk5"
Apr 16 16:30:14.164855 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:14.164800 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vrlk5"]
Apr 16 16:30:14.170339 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:30:14.170307 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb3c2920_54bb_42cd_a7e2_0c22345e99b4.slice/crio-3b2554c8e7d4c13a907bf83cfb503ef0295aec1baa46e4d48e64758daab17fb5 WatchSource:0}: Error finding container 3b2554c8e7d4c13a907bf83cfb503ef0295aec1baa46e4d48e64758daab17fb5: Status 404 returned error can't find the container with id 3b2554c8e7d4c13a907bf83cfb503ef0295aec1baa46e4d48e64758daab17fb5
Apr 16 16:30:14.286750 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:14.286672 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vrlk5" event={"ID":"bb3c2920-54bb-42cd-a7e2-0c22345e99b4","Type":"ContainerStarted","Data":"3b2554c8e7d4c13a907bf83cfb503ef0295aec1baa46e4d48e64758daab17fb5"}
Apr 16 16:30:14.288572 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:14.288527 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-btkpq" event={"ID":"8ba73826-84fd-4301-9663-d9ff7925faa6","Type":"ContainerStarted","Data":"747bc6809ff2564f8945fe53d1d2653798ae98cd647e261f645dd9845178e311"}
Apr 16 16:30:14.289891 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:14.289862 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgk4m" event={"ID":"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb","Type":"ContainerStarted","Data":"40147d6d72bede795cb24caee8e7aab844fbba647c6cc529bdd35dde23ed0b05"}
Apr 16 16:30:16.296920 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:16.296885 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vrlk5" event={"ID":"bb3c2920-54bb-42cd-a7e2-0c22345e99b4","Type":"ContainerStarted","Data":"587976e8694db8db7b1b8c7d22410a29190b2499cf27497eb71fd4b54da0d491"}
Apr 16 16:30:16.296920 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:16.296923 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vrlk5" event={"ID":"bb3c2920-54bb-42cd-a7e2-0c22345e99b4","Type":"ContainerStarted","Data":"a7a802ce370f32c0fdda22d76c2e1d3a13ef7f37393f548fa4fa8ce9b631ec39"}
Apr 16 16:30:16.298197 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:16.298173 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s8lmq" event={"ID":"deb6ad86-785f-4724-a0f9-ecc5341ee981","Type":"ContainerStarted","Data":"5a2ee5b748307389c01998bff1434bded6e793d378402f7fb8c0113b5a11d4da"}
Apr 16 16:30:16.299422 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:16.299395 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-btkpq" event={"ID":"8ba73826-84fd-4301-9663-d9ff7925faa6","Type":"ContainerStarted","Data":"a0e4b8a67b6fe6231cdfc7b1f296f1e2401ca736ffaf1381864e2dd8f036c7f1"}
Apr 16 16:30:16.300948 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:16.300929 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgk4m" event={"ID":"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb","Type":"ContainerStarted","Data":"6763a55b4f7254d6c9a835c905a26a88727d8b52e9030865257f075d967c2a62"}
Apr 16 16:30:16.301043 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:16.300955 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgk4m" event={"ID":"b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb","Type":"ContainerStarted","Data":"6612e3a7de351fa5908322086bb36370c3acea5a21b4263d3927ad7bc5c37952"}
Apr 16 16:30:16.301043 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:16.301022 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-wgk4m"
Apr 16 16:30:16.324786 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:16.324696 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-s8lmq" podStartSLOduration=36.641601139 podStartE2EDuration="39.324682525s" podCreationTimestamp="2026-04-16 16:29:37 +0000 UTC" firstStartedPulling="2026-04-16 16:30:13.107131607 +0000 UTC m=+65.691200334" lastFinishedPulling="2026-04-16 16:30:15.790212979 +0000 UTC m=+68.374281720" observedRunningTime="2026-04-16 16:30:16.323708704 +0000 UTC m=+68.907777452" watchObservedRunningTime="2026-04-16 16:30:16.324682525 +0000 UTC m=+68.908751274"
Apr 16 16:30:16.324893 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:16.324871 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vrlk5" podStartSLOduration=66.654909598 podStartE2EDuration="1m8.324866803s" podCreationTimestamp="2026-04-16 16:29:08 +0000 UTC" firstStartedPulling="2026-04-16 16:30:14.173837194 +0000 UTC m=+66.757905935" lastFinishedPulling="2026-04-16 16:30:15.843794396 +0000 UTC m=+68.427863140" observedRunningTime="2026-04-16 16:30:16.310879432 +0000 UTC m=+68.894948181" watchObservedRunningTime="2026-04-16 16:30:16.324866803 +0000 UTC m=+68.908935553"
Apr 16 16:30:16.339417 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:16.339376 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wgk4m" podStartSLOduration=32.893555997 podStartE2EDuration="35.339364361s" podCreationTimestamp="2026-04-16 16:29:41 +0000 UTC" firstStartedPulling="2026-04-16 16:30:13.344721834 +0000 UTC m=+65.928790568" lastFinishedPulling="2026-04-16 16:30:15.790530199 +0000 UTC m=+68.374598932" observedRunningTime="2026-04-16 16:30:16.338374826 +0000 UTC m=+68.922443574" watchObservedRunningTime="2026-04-16 16:30:16.339364361 +0000 UTC m=+68.923433110"
Apr 16 16:30:16.352807 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:16.352770 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-btkpq" podStartSLOduration=32.948021968 podStartE2EDuration="35.352759784s" podCreationTimestamp="2026-04-16 16:29:41 +0000 UTC" firstStartedPulling="2026-04-16 16:30:13.385795249 +0000 UTC m=+65.969863976" lastFinishedPulling="2026-04-16 16:30:15.790533062 +0000 UTC m=+68.374601792" observedRunningTime="2026-04-16 16:30:16.351443956 +0000 UTC m=+68.935512707" watchObservedRunningTime="2026-04-16 16:30:16.352759784 +0000 UTC m=+68.936828530"
Apr 16 16:30:17.695727 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.695693 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-zd9fq"]
Apr 16 16:30:17.698981 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.698961 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-8jqx9"]
Apr 16 16:30:17.699119 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.699100 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-zd9fq"
Apr 16 16:30:17.701467 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.701447 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 16 16:30:17.701573 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.701475 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 16 16:30:17.701573 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.701478 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-tnfzq\""
Apr 16 16:30:17.701711 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.701598 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:30:17.701711 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.701603 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 16 16:30:17.701813 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.701792 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fpglh"]
Apr 16 16:30:17.701932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.701915 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-8jqx9"
Apr 16 16:30:17.704867 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.704850 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fpglh"
Apr 16 16:30:17.705838 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.705813 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 16 16:30:17.705932 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.705903 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:30:17.706160 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.706147 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-gg56v\""
Apr 16 16:30:17.707267 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.707230 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 16 16:30:17.707421 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.707403 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-866tx\""
Apr 16 16:30:17.707693 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.707675 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:30:17.707921 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.707888 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 16 16:30:17.708329 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.708311 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 16 16:30:17.709402 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.709385 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-8jqx9"]
Apr 16 16:30:17.713076 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.713056 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fpglh"]
Apr 16 16:30:17.714191 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.714173 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-zd9fq"]
Apr 16 16:30:17.735152 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.735132 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b96129c-a95c-41b4-9236-3a4c2e5bc09d-serving-cert\") pod \"console-operator-d87b8d5fc-zd9fq\" (UID: \"4b96129c-a95c-41b4-9236-3a4c2e5bc09d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zd9fq"
Apr 16 16:30:17.735248 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.735165 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b96129c-a95c-41b4-9236-3a4c2e5bc09d-trusted-ca\") pod \"console-operator-d87b8d5fc-zd9fq\" (UID: \"4b96129c-a95c-41b4-9236-3a4c2e5bc09d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zd9fq"
Apr 16 16:30:17.735248 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.735233 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vtpj\" (UniqueName: \"kubernetes.io/projected/4b96129c-a95c-41b4-9236-3a4c2e5bc09d-kube-api-access-5vtpj\") pod \"console-operator-d87b8d5fc-zd9fq\" (UID: \"4b96129c-a95c-41b4-9236-3a4c2e5bc09d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zd9fq"
Apr 16 16:30:17.735321 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.735262 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b96129c-a95c-41b4-9236-3a4c2e5bc09d-config\") pod \"console-operator-d87b8d5fc-zd9fq\" (UID: \"4b96129c-a95c-41b4-9236-3a4c2e5bc09d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zd9fq"
Apr 16 16:30:17.792723 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.792682 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-4jtkd"]
Apr 16 16:30:17.795878 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.795857 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-qhdbc"]
Apr 16 16:30:17.796027 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.796009 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4jtkd"
Apr 16 16:30:17.798662 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.798637 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5f6556957d-vngdp"]
Apr 16 16:30:17.798778 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.798729 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc"
Apr 16 16:30:17.799850 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.799829 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 16 16:30:17.799970 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.799867 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 16 16:30:17.800108 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.800096 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 16:30:17.800731 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.800719 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-fwm2c\""
Apr 16 16:30:17.800799 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.800765 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 16:30:17.801463 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.801443 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 16:30:17.801609 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.801583 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 16:30:17.801609 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.801594 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5f6556957d-vngdp"
Apr 16 16:30:17.803128 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.803100 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 16 16:30:17.804341 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.804323 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 16 16:30:17.804433 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.804424 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 16 16:30:17.804497 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.804437 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 16:30:17.804737 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.804721 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 16 16:30:17.804841 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.804827 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-7krlw\""
Apr 16 16:30:17.804910 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.804837 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-g65m2\""
Apr 16 16:30:17.805961 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.805941 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 16:30:17.806049 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.805963 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 16 16:30:17.806166 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.806154 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 16 16:30:17.810008 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.809991 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 16 16:30:17.817233 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.817205 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-qhdbc"]
Apr 16 16:30:17.818715 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.818697 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5f6556957d-vngdp"]
Apr 16 16:30:17.820216 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.820197 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-4jtkd"]
Apr 16 16:30:17.835833 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.835814 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-4jtkd\" (UID: \"d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4jtkd"
Apr 16 16:30:17.835931 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.835845 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92-tmp\") pod \"insights-operator-5785d4fcdd-qhdbc\" (UID: \"e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92\") " pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc"
Apr 16 16:30:17.835931 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.835874 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57ppk\" (UniqueName: \"kubernetes.io/projected/27bbf9b0-7f1e-4f00-be95-79c80f1eae4a-kube-api-access-57ppk\") pod \"router-default-5f6556957d-vngdp\" (UID: \"27bbf9b0-7f1e-4f00-be95-79c80f1eae4a\") " pod="openshift-ingress/router-default-5f6556957d-vngdp"
Apr 16 16:30:17.835931 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.835904 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-qhdbc\" (UID: \"e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92\") " pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc"
Apr 16 16:30:17.836038 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.835962 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vtpj\" (UniqueName: \"kubernetes.io/projected/4b96129c-a95c-41b4-9236-3a4c2e5bc09d-kube-api-access-5vtpj\") pod \"console-operator-d87b8d5fc-zd9fq\" (UID: \"4b96129c-a95c-41b4-9236-3a4c2e5bc09d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zd9fq"
Apr 16 16:30:17.836038 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.835999 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27bbf9b0-7f1e-4f00-be95-79c80f1eae4a-service-ca-bundle\") pod \"router-default-5f6556957d-vngdp\" (UID: \"27bbf9b0-7f1e-4f00-be95-79c80f1eae4a\") " pod="openshift-ingress/router-default-5f6556957d-vngdp"
Apr 16 16:30:17.836038 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.836026 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/27bbf9b0-7f1e-4f00-be95-79c80f1eae4a-stats-auth\") pod \"router-default-5f6556957d-vngdp\" (UID: \"27bbf9b0-7f1e-4f00-be95-79c80f1eae4a\") " pod="openshift-ingress/router-default-5f6556957d-vngdp"
Apr 16 16:30:17.836134 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.836060 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92-snapshots\") pod \"insights-operator-5785d4fcdd-qhdbc\" (UID: \"e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92\") " pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc"
Apr 16 16:30:17.836134 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.836090 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b96129c-a95c-41b4-9236-3a4c2e5bc09d-config\") pod \"console-operator-d87b8d5fc-zd9fq\" (UID: \"4b96129c-a95c-41b4-9236-3a4c2e5bc09d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zd9fq"
Apr 16 16:30:17.836134 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.836120 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27bbf9b0-7f1e-4f00-be95-79c80f1eae4a-metrics-certs\") pod \"router-default-5f6556957d-vngdp\" (UID: \"27bbf9b0-7f1e-4f00-be95-79c80f1eae4a\") " pod="openshift-ingress/router-default-5f6556957d-vngdp"
Apr 16 16:30:17.836259 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.836147 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwn4m\" (UniqueName: \"kubernetes.io/projected/d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c-kube-api-access-qwn4m\") pod \"cluster-monitoring-operator-6667474d89-4jtkd\" (UID: \"d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4jtkd"
Apr 16 16:30:17.836259 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.836176 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/abf23a21-89bf-4e30-9d95-e5faa73177f7-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-fpglh\" (UID: \"abf23a21-89bf-4e30-9d95-e5faa73177f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fpglh"
Apr 16 16:30:17.836259 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.836213 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/27bbf9b0-7f1e-4f00-be95-79c80f1eae4a-default-certificate\") pod \"router-default-5f6556957d-vngdp\" (UID: \"27bbf9b0-7f1e-4f00-be95-79c80f1eae4a\") " pod="openshift-ingress/router-default-5f6556957d-vngdp"
Apr 16 16:30:17.836259 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.836240 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b96129c-a95c-41b4-9236-3a4c2e5bc09d-serving-cert\") pod \"console-operator-d87b8d5fc-zd9fq\" (UID: \"4b96129c-a95c-41b4-9236-3a4c2e5bc09d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zd9fq"
Apr 16 16:30:17.836403 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.836289 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b96129c-a95c-41b4-9236-3a4c2e5bc09d-trusted-ca\") pod \"console-operator-d87b8d5fc-zd9fq\" (UID: \"4b96129c-a95c-41b4-9236-3a4c2e5bc09d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zd9fq"
Apr 16 16:30:17.836403 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.836313 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92-serving-cert\") pod \"insights-operator-5785d4fcdd-qhdbc\" (UID: \"e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92\") " pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc"
Apr 16 16:30:17.836403 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.836341 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpzb5\" (UniqueName: \"kubernetes.io/projected/a7e24cf9-3d97-4d0c-b43a-731997d2fe53-kube-api-access-tpzb5\") pod \"volume-data-source-validator-7d955d5dd4-8jqx9\" (UID: \"a7e24cf9-3d97-4d0c-b43a-731997d2fe53\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-8jqx9"
Apr 16 16:30:17.836403 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.836365 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wr9x\" (UniqueName: \"kubernetes.io/projected/e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92-kube-api-access-8wr9x\") pod \"insights-operator-5785d4fcdd-qhdbc\" (UID: \"e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92\") " pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc"
Apr 16 16:30:17.836403 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.836391 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-qhdbc\" (UID: \"e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92\") " pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc"
Apr 16 16:30:17.836672 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.836447 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4jtkd\" (UID: \"d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4jtkd"
Apr 16 16:30:17.836672 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.836485 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kssj2\" (UniqueName: \"kubernetes.io/projected/abf23a21-89bf-4e30-9d95-e5faa73177f7-kube-api-access-kssj2\") pod \"cluster-samples-operator-667775844f-fpglh\" (UID: \"abf23a21-89bf-4e30-9d95-e5faa73177f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fpglh"
Apr 16 16:30:17.836915 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.836899 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b96129c-a95c-41b4-9236-3a4c2e5bc09d-config\") pod \"console-operator-d87b8d5fc-zd9fq\" (UID: \"4b96129c-a95c-41b4-9236-3a4c2e5bc09d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zd9fq"
Apr 16 16:30:17.837094 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.837079 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b96129c-a95c-41b4-9236-3a4c2e5bc09d-trusted-ca\") pod \"console-operator-d87b8d5fc-zd9fq\" (UID: \"4b96129c-a95c-41b4-9236-3a4c2e5bc09d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zd9fq"
Apr 16 16:30:17.838556 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.838541 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b96129c-a95c-41b4-9236-3a4c2e5bc09d-serving-cert\") pod \"console-operator-d87b8d5fc-zd9fq\" (UID: \"4b96129c-a95c-41b4-9236-3a4c2e5bc09d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zd9fq"
Apr 16 16:30:17.854710 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.854684 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vtpj\" (UniqueName: \"kubernetes.io/projected/4b96129c-a95c-41b4-9236-3a4c2e5bc09d-kube-api-access-5vtpj\") pod \"console-operator-d87b8d5fc-zd9fq\" (UID: \"4b96129c-a95c-41b4-9236-3a4c2e5bc09d\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zd9fq"
Apr 16 16:30:17.937077 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.937051 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27bbf9b0-7f1e-4f00-be95-79c80f1eae4a-metrics-certs\") pod \"router-default-5f6556957d-vngdp\" (UID: \"27bbf9b0-7f1e-4f00-be95-79c80f1eae4a\") " pod="openshift-ingress/router-default-5f6556957d-vngdp"
Apr 16 16:30:17.937077 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.937080 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwn4m\" (UniqueName: \"kubernetes.io/projected/d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c-kube-api-access-qwn4m\") pod \"cluster-monitoring-operator-6667474d89-4jtkd\" (UID: \"d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4jtkd"
Apr 16 16:30:17.937278 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.937098 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/abf23a21-89bf-4e30-9d95-e5faa73177f7-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-fpglh\" (UID: \"abf23a21-89bf-4e30-9d95-e5faa73177f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fpglh"
Apr 16 16:30:17.937278 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.937120 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/27bbf9b0-7f1e-4f00-be95-79c80f1eae4a-default-certificate\") pod \"router-default-5f6556957d-vngdp\" (UID: \"27bbf9b0-7f1e-4f00-be95-79c80f1eae4a\") " pod="openshift-ingress/router-default-5f6556957d-vngdp"
Apr 16 16:30:17.937278 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.937146 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92-serving-cert\") pod \"insights-operator-5785d4fcdd-qhdbc\" (UID: \"e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92\") " pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc"
Apr 16 16:30:17.937278 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.937172 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpzb5\" (UniqueName: \"kubernetes.io/projected/a7e24cf9-3d97-4d0c-b43a-731997d2fe53-kube-api-access-tpzb5\") pod \"volume-data-source-validator-7d955d5dd4-8jqx9\" (UID: \"a7e24cf9-3d97-4d0c-b43a-731997d2fe53\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-8jqx9"
Apr 16 16:30:17.937278 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.937196 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wr9x\" (UniqueName: \"kubernetes.io/projected/e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92-kube-api-access-8wr9x\") pod \"insights-operator-5785d4fcdd-qhdbc\" (UID: \"e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92\") " pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc"
Apr 16 16:30:17.937278 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.937221 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-qhdbc\" (UID: \"e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92\") " pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc"
Apr 16 16:30:17.937278 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.937251 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4jtkd\" (UID: \"d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4jtkd"
Apr 16 16:30:17.937618 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.937282 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kssj2\" (UniqueName: \"kubernetes.io/projected/abf23a21-89bf-4e30-9d95-e5faa73177f7-kube-api-access-kssj2\") pod \"cluster-samples-operator-667775844f-fpglh\" (UID: \"abf23a21-89bf-4e30-9d95-e5faa73177f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fpglh"
Apr 16 16:30:17.937618 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.937310 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-4jtkd\" (UID: \"d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4jtkd"
Apr 16 16:30:17.937618 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.937342 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92-tmp\") pod \"insights-operator-5785d4fcdd-qhdbc\" (UID: \"e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92\") " pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc"
Apr 16 16:30:17.937618 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.937366 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57ppk\" (UniqueName: \"kubernetes.io/projected/27bbf9b0-7f1e-4f00-be95-79c80f1eae4a-kube-api-access-57ppk\") pod \"router-default-5f6556957d-vngdp\" (UID: \"27bbf9b0-7f1e-4f00-be95-79c80f1eae4a\") " pod="openshift-ingress/router-default-5f6556957d-vngdp"
Apr 16 16:30:17.937618 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.937396 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-qhdbc\" (UID: \"e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92\") " pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc"
Apr 16 16:30:17.937618 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.937459 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27bbf9b0-7f1e-4f00-be95-79c80f1eae4a-service-ca-bundle\") pod \"router-default-5f6556957d-vngdp\" (UID: \"27bbf9b0-7f1e-4f00-be95-79c80f1eae4a\") " pod="openshift-ingress/router-default-5f6556957d-vngdp"
Apr 16 16:30:17.937618 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.937486 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/27bbf9b0-7f1e-4f00-be95-79c80f1eae4a-stats-auth\") pod \"router-default-5f6556957d-vngdp\" (UID: \"27bbf9b0-7f1e-4f00-be95-79c80f1eae4a\") " pod="openshift-ingress/router-default-5f6556957d-vngdp"
Apr 16 16:30:17.937618 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.937524 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92-snapshots\") pod \"insights-operator-5785d4fcdd-qhdbc\" (UID: \"e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92\") " pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc"
Apr 16 16:30:17.938143 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.938120 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName:
\"kubernetes.io/empty-dir/e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92-tmp\") pod \"insights-operator-5785d4fcdd-qhdbc\" (UID: \"e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92\") " pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc" Apr 16 16:30:17.938210 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.938173 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92-snapshots\") pod \"insights-operator-5785d4fcdd-qhdbc\" (UID: \"e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92\") " pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc" Apr 16 16:30:17.938307 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.938286 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-qhdbc\" (UID: \"e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92\") " pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc" Apr 16 16:30:17.938405 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.938383 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27bbf9b0-7f1e-4f00-be95-79c80f1eae4a-service-ca-bundle\") pod \"router-default-5f6556957d-vngdp\" (UID: \"27bbf9b0-7f1e-4f00-be95-79c80f1eae4a\") " pod="openshift-ingress/router-default-5f6556957d-vngdp" Apr 16 16:30:17.938922 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.938899 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-qhdbc\" (UID: \"e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92\") " pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc" Apr 16 16:30:17.939025 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.938931 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-4jtkd\" (UID: \"d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4jtkd" Apr 16 16:30:17.940143 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.940116 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92-serving-cert\") pod \"insights-operator-5785d4fcdd-qhdbc\" (UID: \"e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92\") " pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc" Apr 16 16:30:17.940267 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.940177 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4jtkd\" (UID: \"d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4jtkd" Apr 16 16:30:17.940425 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.940405 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27bbf9b0-7f1e-4f00-be95-79c80f1eae4a-metrics-certs\") pod \"router-default-5f6556957d-vngdp\" (UID: \"27bbf9b0-7f1e-4f00-be95-79c80f1eae4a\") " pod="openshift-ingress/router-default-5f6556957d-vngdp" Apr 16 16:30:17.940617 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.940593 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/27bbf9b0-7f1e-4f00-be95-79c80f1eae4a-default-certificate\") pod \"router-default-5f6556957d-vngdp\" (UID: 
\"27bbf9b0-7f1e-4f00-be95-79c80f1eae4a\") " pod="openshift-ingress/router-default-5f6556957d-vngdp" Apr 16 16:30:17.941012 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.940995 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/27bbf9b0-7f1e-4f00-be95-79c80f1eae4a-stats-auth\") pod \"router-default-5f6556957d-vngdp\" (UID: \"27bbf9b0-7f1e-4f00-be95-79c80f1eae4a\") " pod="openshift-ingress/router-default-5f6556957d-vngdp" Apr 16 16:30:17.941434 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.941414 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/abf23a21-89bf-4e30-9d95-e5faa73177f7-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-fpglh\" (UID: \"abf23a21-89bf-4e30-9d95-e5faa73177f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fpglh" Apr 16 16:30:17.946814 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.946748 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kssj2\" (UniqueName: \"kubernetes.io/projected/abf23a21-89bf-4e30-9d95-e5faa73177f7-kube-api-access-kssj2\") pod \"cluster-samples-operator-667775844f-fpglh\" (UID: \"abf23a21-89bf-4e30-9d95-e5faa73177f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fpglh" Apr 16 16:30:17.946907 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.946844 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwn4m\" (UniqueName: \"kubernetes.io/projected/d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c-kube-api-access-qwn4m\") pod \"cluster-monitoring-operator-6667474d89-4jtkd\" (UID: \"d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4jtkd" Apr 16 16:30:17.947185 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.947166 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-57ppk\" (UniqueName: \"kubernetes.io/projected/27bbf9b0-7f1e-4f00-be95-79c80f1eae4a-kube-api-access-57ppk\") pod \"router-default-5f6556957d-vngdp\" (UID: \"27bbf9b0-7f1e-4f00-be95-79c80f1eae4a\") " pod="openshift-ingress/router-default-5f6556957d-vngdp" Apr 16 16:30:17.947250 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.947200 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpzb5\" (UniqueName: \"kubernetes.io/projected/a7e24cf9-3d97-4d0c-b43a-731997d2fe53-kube-api-access-tpzb5\") pod \"volume-data-source-validator-7d955d5dd4-8jqx9\" (UID: \"a7e24cf9-3d97-4d0c-b43a-731997d2fe53\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-8jqx9" Apr 16 16:30:17.947574 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:17.947558 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wr9x\" (UniqueName: \"kubernetes.io/projected/e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92-kube-api-access-8wr9x\") pod \"insights-operator-5785d4fcdd-qhdbc\" (UID: \"e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92\") " pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc" Apr 16 16:30:18.011050 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:18.011026 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-zd9fq" Apr 16 16:30:18.018619 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:18.018599 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-8jqx9" Apr 16 16:30:18.023257 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:18.023238 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fpglh" Apr 16 16:30:18.108069 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:18.108042 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4jtkd" Apr 16 16:30:18.112531 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:18.112157 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc" Apr 16 16:30:18.120623 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:18.119705 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5f6556957d-vngdp" Apr 16 16:30:18.205285 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:18.200379 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-zd9fq"] Apr 16 16:30:18.205285 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:30:18.202150 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b96129c_a95c_41b4_9236_3a4c2e5bc09d.slice/crio-6da908f0dc612c479cea0611e6550f257512821caf84d067c414d73d48a7b960 WatchSource:0}: Error finding container 6da908f0dc612c479cea0611e6550f257512821caf84d067c414d73d48a7b960: Status 404 returned error can't find the container with id 6da908f0dc612c479cea0611e6550f257512821caf84d067c414d73d48a7b960 Apr 16 16:30:18.281611 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:18.281585 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-qhdbc"] Apr 16 16:30:18.283201 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:30:18.283168 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8d29cd4_bc8b_4e6f_ae9b_d48e9d231c92.slice/crio-ecc978519c68b031c0b4da4bfdc11d2a18db1f7d004bf31ffff0c9b2bd021a1c WatchSource:0}: Error finding container ecc978519c68b031c0b4da4bfdc11d2a18db1f7d004bf31ffff0c9b2bd021a1c: Status 404 returned error can't find the container with id ecc978519c68b031c0b4da4bfdc11d2a18db1f7d004bf31ffff0c9b2bd021a1c Apr 16 16:30:18.307262 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:18.307236 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-zd9fq" event={"ID":"4b96129c-a95c-41b4-9236-3a4c2e5bc09d","Type":"ContainerStarted","Data":"6da908f0dc612c479cea0611e6550f257512821caf84d067c414d73d48a7b960"} Apr 16 16:30:18.308182 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:18.308165 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc" event={"ID":"e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92","Type":"ContainerStarted","Data":"ecc978519c68b031c0b4da4bfdc11d2a18db1f7d004bf31ffff0c9b2bd021a1c"} Apr 16 16:30:18.419297 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:18.419273 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fpglh"] Apr 16 16:30:18.422086 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:18.422065 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-8jqx9"] Apr 16 16:30:18.426815 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:30:18.426786 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7e24cf9_3d97_4d0c_b43a_731997d2fe53.slice/crio-e2683d9e141bfeacc08921076f4b7a3d738c75910c10d20a26b2637e45e3d16f WatchSource:0}: Error finding container e2683d9e141bfeacc08921076f4b7a3d738c75910c10d20a26b2637e45e3d16f: Status 
404 returned error can't find the container with id e2683d9e141bfeacc08921076f4b7a3d738c75910c10d20a26b2637e45e3d16f Apr 16 16:30:18.504383 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:18.504358 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5f6556957d-vngdp"] Apr 16 16:30:18.506580 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:18.506545 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-4jtkd"] Apr 16 16:30:18.507296 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:30:18.507268 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27bbf9b0_7f1e_4f00_be95_79c80f1eae4a.slice/crio-b3cd85c04302b19389c93e1e6ba3097dd15ef8454d0ca5f4fc3f61d0e38d90e0 WatchSource:0}: Error finding container b3cd85c04302b19389c93e1e6ba3097dd15ef8454d0ca5f4fc3f61d0e38d90e0: Status 404 returned error can't find the container with id b3cd85c04302b19389c93e1e6ba3097dd15ef8454d0ca5f4fc3f61d0e38d90e0 Apr 16 16:30:18.508789 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:30:18.508748 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd236fa1f_b4af_4f64_9c14_3fe8a1f8c91c.slice/crio-745f85ea4a5382d6c175c210f5532cee914347e73b027e3213493e04d55b8d19 WatchSource:0}: Error finding container 745f85ea4a5382d6c175c210f5532cee914347e73b027e3213493e04d55b8d19: Status 404 returned error can't find the container with id 745f85ea4a5382d6c175c210f5532cee914347e73b027e3213493e04d55b8d19 Apr 16 16:30:19.314732 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:19.314330 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5f6556957d-vngdp" event={"ID":"27bbf9b0-7f1e-4f00-be95-79c80f1eae4a","Type":"ContainerStarted","Data":"5ab5a5e8e17154a4acdd8dfabc3a4b07dee5a38e3a0b424c0a342c0387df4890"} Apr 16 16:30:19.314732 
ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:19.314372 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5f6556957d-vngdp" event={"ID":"27bbf9b0-7f1e-4f00-be95-79c80f1eae4a","Type":"ContainerStarted","Data":"b3cd85c04302b19389c93e1e6ba3097dd15ef8454d0ca5f4fc3f61d0e38d90e0"} Apr 16 16:30:19.319204 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:19.319176 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4jtkd" event={"ID":"d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c","Type":"ContainerStarted","Data":"745f85ea4a5382d6c175c210f5532cee914347e73b027e3213493e04d55b8d19"} Apr 16 16:30:19.321170 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:19.321145 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fpglh" event={"ID":"abf23a21-89bf-4e30-9d95-e5faa73177f7","Type":"ContainerStarted","Data":"2e176c8ee5cd655f1d5e9e593d47c281202efc23cdc7be98948e2c0f0c3dcd04"} Apr 16 16:30:19.323175 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:19.323135 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-8jqx9" event={"ID":"a7e24cf9-3d97-4d0c-b43a-731997d2fe53","Type":"ContainerStarted","Data":"e2683d9e141bfeacc08921076f4b7a3d738c75910c10d20a26b2637e45e3d16f"} Apr 16 16:30:19.335672 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:19.334822 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5f6556957d-vngdp" podStartSLOduration=2.33480635 podStartE2EDuration="2.33480635s" podCreationTimestamp="2026-04-16 16:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:30:19.333319819 +0000 UTC m=+71.917388569" watchObservedRunningTime="2026-04-16 16:30:19.33480635 +0000 
UTC m=+71.918875100" Apr 16 16:30:20.121362 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:20.121327 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5f6556957d-vngdp" Apr 16 16:30:20.124450 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:20.124427 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5f6556957d-vngdp" Apr 16 16:30:20.326960 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:20.326919 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5f6556957d-vngdp" Apr 16 16:30:20.328385 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:20.328359 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5f6556957d-vngdp" Apr 16 16:30:21.200284 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:21.200234 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" podUID="160b9aaa-8178-4147-a5f7-c886dd47054c" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 16:30:24.342512 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:24.342411 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-zd9fq" event={"ID":"4b96129c-a95c-41b4-9236-3a4c2e5bc09d","Type":"ContainerStarted","Data":"3fafd4526b2dd78e39fece1b6abb6a15f93f2631be88c5c62a6526445bc66fb9"} Apr 16 16:30:24.342983 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:24.342886 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-zd9fq" Apr 16 16:30:24.344629 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:24.344591 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fpglh" event={"ID":"abf23a21-89bf-4e30-9d95-e5faa73177f7","Type":"ContainerStarted","Data":"c9c0227f6ca31b8322f3b000b7889c2fdb80e049cc3a4f6e099d32f6157f82ee"} Apr 16 16:30:24.344629 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:24.344623 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fpglh" event={"ID":"abf23a21-89bf-4e30-9d95-e5faa73177f7","Type":"ContainerStarted","Data":"4f0704643aabce90eeafc524daa9dd14b700b3ef0c1059131f2f8dd2db7413fb"} Apr 16 16:30:24.346520 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:24.346120 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-8jqx9" event={"ID":"a7e24cf9-3d97-4d0c-b43a-731997d2fe53","Type":"ContainerStarted","Data":"42cb7184bc2b617aead8222b932a4c43784e3ea8013b5d2c3adf9dd829827a6c"} Apr 16 16:30:24.347894 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:24.347860 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc" event={"ID":"e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92","Type":"ContainerStarted","Data":"defd549519ec652b02ddbf4918485b4f62367e2371447af2e63da6b01edb1443"} Apr 16 16:30:24.349467 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:24.349446 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4jtkd" event={"ID":"d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c","Type":"ContainerStarted","Data":"98b4aeb150202c1a7a8efd25b11973b29d65cd5509f4597d643185185ed2d01d"} Apr 16 16:30:24.360441 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:24.360396 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-zd9fq" podStartSLOduration=1.918252351 podStartE2EDuration="7.360381226s" 
podCreationTimestamp="2026-04-16 16:30:17 +0000 UTC" firstStartedPulling="2026-04-16 16:30:18.204922256 +0000 UTC m=+70.788990990" lastFinishedPulling="2026-04-16 16:30:23.64705112 +0000 UTC m=+76.231119865" observedRunningTime="2026-04-16 16:30:24.358554572 +0000 UTC m=+76.942623320" watchObservedRunningTime="2026-04-16 16:30:24.360381226 +0000 UTC m=+76.944449981" Apr 16 16:30:24.374600 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:24.374553 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4jtkd" podStartSLOduration=2.238292691 podStartE2EDuration="7.37453758s" podCreationTimestamp="2026-04-16 16:30:17 +0000 UTC" firstStartedPulling="2026-04-16 16:30:18.510443481 +0000 UTC m=+71.094512207" lastFinishedPulling="2026-04-16 16:30:23.646688369 +0000 UTC m=+76.230757096" observedRunningTime="2026-04-16 16:30:24.372520197 +0000 UTC m=+76.956588943" watchObservedRunningTime="2026-04-16 16:30:24.37453758 +0000 UTC m=+76.958606330" Apr 16 16:30:24.403070 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:24.403028 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-8jqx9" podStartSLOduration=2.185063275 podStartE2EDuration="7.403018967s" podCreationTimestamp="2026-04-16 16:30:17 +0000 UTC" firstStartedPulling="2026-04-16 16:30:18.428383594 +0000 UTC m=+71.012452321" lastFinishedPulling="2026-04-16 16:30:23.646339271 +0000 UTC m=+76.230408013" observedRunningTime="2026-04-16 16:30:24.386471945 +0000 UTC m=+76.970540698" watchObservedRunningTime="2026-04-16 16:30:24.403018967 +0000 UTC m=+76.987087716" Apr 16 16:30:24.403351 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:24.403321 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fpglh" podStartSLOduration=2.21268167 
podStartE2EDuration="7.403313802s" podCreationTimestamp="2026-04-16 16:30:17 +0000 UTC" firstStartedPulling="2026-04-16 16:30:18.456416947 +0000 UTC m=+71.040485675" lastFinishedPulling="2026-04-16 16:30:23.647049066 +0000 UTC m=+76.231117807" observedRunningTime="2026-04-16 16:30:24.402344666 +0000 UTC m=+76.986413428" watchObservedRunningTime="2026-04-16 16:30:24.403313802 +0000 UTC m=+76.987382551" Apr 16 16:30:24.419923 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:24.419872 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-qhdbc" podStartSLOduration=2.052653102 podStartE2EDuration="7.419865431s" podCreationTimestamp="2026-04-16 16:30:17 +0000 UTC" firstStartedPulling="2026-04-16 16:30:18.285037582 +0000 UTC m=+70.869106309" lastFinishedPulling="2026-04-16 16:30:23.652249906 +0000 UTC m=+76.236318638" observedRunningTime="2026-04-16 16:30:24.418742642 +0000 UTC m=+77.002811401" watchObservedRunningTime="2026-04-16 16:30:24.419865431 +0000 UTC m=+77.003934182" Apr 16 16:30:24.770547 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:24.770520 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-zd9fq" Apr 16 16:30:26.305918 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:26.305892 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wgk4m" Apr 16 16:30:26.585566 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:26.585500 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wgk4m_b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb/dns/0.log" Apr 16 16:30:26.771832 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:26.771805 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wgk4m_b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb/kube-rbac-proxy/0.log" Apr 16 16:30:26.971602 ip-10-0-139-12 kubenswrapper[2573]: I0416 
16:30:26.971577 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-78zgg_a2387338-f572-4af1-a326-edcc55b8077f/dns-node-resolver/0.log" Apr 16 16:30:27.233421 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:27.233352 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xzlm9" Apr 16 16:30:27.574254 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:27.574113 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-65c845f5f8-7c7zn_0f60a8ab-e833-4e10-8ef6-e02666b7e615/registry/0.log" Apr 16 16:30:27.965503 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:27.965471 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-nknjf"] Apr 16 16:30:27.991874 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:27.991843 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nknjf"] Apr 16 16:30:27.992014 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:27.991902 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nknjf" Apr 16 16:30:27.994938 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:27.994912 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 16:30:27.994938 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:27.994930 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 16:30:27.995120 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:27.994934 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-tj77f\"" Apr 16 16:30:28.129202 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:28.129171 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nknjf\" (UID: \"7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3\") " pod="openshift-insights/insights-runtime-extractor-nknjf" Apr 16 16:30:28.129375 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:28.129223 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nknjf\" (UID: \"7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3\") " pod="openshift-insights/insights-runtime-extractor-nknjf" Apr 16 16:30:28.129375 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:28.129244 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njgk9\" (UniqueName: \"kubernetes.io/projected/7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3-kube-api-access-njgk9\") pod 
\"insights-runtime-extractor-nknjf\" (UID: \"7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3\") " pod="openshift-insights/insights-runtime-extractor-nknjf" Apr 16 16:30:28.129375 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:28.129289 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3-data-volume\") pod \"insights-runtime-extractor-nknjf\" (UID: \"7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3\") " pod="openshift-insights/insights-runtime-extractor-nknjf" Apr 16 16:30:28.129375 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:28.129311 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3-crio-socket\") pod \"insights-runtime-extractor-nknjf\" (UID: \"7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3\") " pod="openshift-insights/insights-runtime-extractor-nknjf" Apr 16 16:30:28.172403 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:28.172376 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pkd9h_dc8e5144-a514-40ce-8e9e-c4b13ea6d7af/node-ca/0.log" Apr 16 16:30:28.230032 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:28.229953 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3-data-volume\") pod \"insights-runtime-extractor-nknjf\" (UID: \"7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3\") " pod="openshift-insights/insights-runtime-extractor-nknjf" Apr 16 16:30:28.230032 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:28.229999 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3-crio-socket\") pod \"insights-runtime-extractor-nknjf\" (UID: 
\"7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3\") " pod="openshift-insights/insights-runtime-extractor-nknjf" Apr 16 16:30:28.230212 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:28.230073 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3-crio-socket\") pod \"insights-runtime-extractor-nknjf\" (UID: \"7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3\") " pod="openshift-insights/insights-runtime-extractor-nknjf" Apr 16 16:30:28.230212 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:28.230120 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nknjf\" (UID: \"7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3\") " pod="openshift-insights/insights-runtime-extractor-nknjf" Apr 16 16:30:28.230212 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:28.230199 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nknjf\" (UID: \"7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3\") " pod="openshift-insights/insights-runtime-extractor-nknjf" Apr 16 16:30:28.230338 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:28.230226 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-njgk9\" (UniqueName: \"kubernetes.io/projected/7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3-kube-api-access-njgk9\") pod \"insights-runtime-extractor-nknjf\" (UID: \"7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3\") " pod="openshift-insights/insights-runtime-extractor-nknjf" Apr 16 16:30:28.230338 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:28.230243 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3-data-volume\") pod \"insights-runtime-extractor-nknjf\" (UID: \"7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3\") " pod="openshift-insights/insights-runtime-extractor-nknjf" Apr 16 16:30:28.230735 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:28.230713 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nknjf\" (UID: \"7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3\") " pod="openshift-insights/insights-runtime-extractor-nknjf" Apr 16 16:30:28.232356 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:28.232336 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nknjf\" (UID: \"7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3\") " pod="openshift-insights/insights-runtime-extractor-nknjf" Apr 16 16:30:28.239063 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:28.239044 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-njgk9\" (UniqueName: \"kubernetes.io/projected/7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3-kube-api-access-njgk9\") pod \"insights-runtime-extractor-nknjf\" (UID: \"7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3\") " pod="openshift-insights/insights-runtime-extractor-nknjf" Apr 16 16:30:28.301268 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:28.301244 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nknjf" Apr 16 16:30:28.415152 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:28.415117 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nknjf"] Apr 16 16:30:28.417918 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:30:28.417890 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e0f8ceb_99a1_42ac_8beb_298e0e5ce9a3.slice/crio-714f78976faf5666234635b65e3228bb3959c1923d2afcf5f7f2fc4d827c397d WatchSource:0}: Error finding container 714f78976faf5666234635b65e3228bb3959c1923d2afcf5f7f2fc4d827c397d: Status 404 returned error can't find the container with id 714f78976faf5666234635b65e3228bb3959c1923d2afcf5f7f2fc4d827c397d Apr 16 16:30:28.574682 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:28.574589 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5f6556957d-vngdp_27bbf9b0-7f1e-4f00-be95-79c80f1eae4a/router/0.log" Apr 16 16:30:28.972166 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:28.972136 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-btkpq_8ba73826-84fd-4301-9663-d9ff7925faa6/serve-healthcheck-canary/0.log" Apr 16 16:30:29.367557 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:29.367483 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nknjf" event={"ID":"7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3","Type":"ContainerStarted","Data":"1ce9e100c291d63b604f0f4d501c11b48e7b798330611f590c5d4eed302c58d6"} Apr 16 16:30:29.367557 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:29.367521 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nknjf" 
event={"ID":"7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3","Type":"ContainerStarted","Data":"714f78976faf5666234635b65e3228bb3959c1923d2afcf5f7f2fc4d827c397d"} Apr 16 16:30:30.375716 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:30.375672 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nknjf" event={"ID":"7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3","Type":"ContainerStarted","Data":"2d56d357d4d2a8c9181389a6c8322ca24b46cfc954b3892d7e40bb93a1a9f5ac"} Apr 16 16:30:31.200002 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:31.199956 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" podUID="160b9aaa-8178-4147-a5f7-c886dd47054c" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 16:30:31.200206 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:31.200039 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" Apr 16 16:30:31.200724 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:31.200641 2573 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"57907be2db0e926b09f9969dfe7b00e493538bc3482ccc703ce6cbc40b8e5378"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 16:30:31.200860 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:31.200754 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" podUID="160b9aaa-8178-4147-a5f7-c886dd47054c" containerName="service-proxy" containerID="cri-o://57907be2db0e926b09f9969dfe7b00e493538bc3482ccc703ce6cbc40b8e5378" gracePeriod=30 Apr 16 16:30:31.381160 
ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:31.381137 2573 generic.go:358] "Generic (PLEG): container finished" podID="160b9aaa-8178-4147-a5f7-c886dd47054c" containerID="57907be2db0e926b09f9969dfe7b00e493538bc3482ccc703ce6cbc40b8e5378" exitCode=2 Apr 16 16:30:31.381461 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:31.381190 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" event={"ID":"160b9aaa-8178-4147-a5f7-c886dd47054c","Type":"ContainerDied","Data":"57907be2db0e926b09f9969dfe7b00e493538bc3482ccc703ce6cbc40b8e5378"} Apr 16 16:30:32.385537 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:32.385499 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nknjf" event={"ID":"7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3","Type":"ContainerStarted","Data":"90af4d1bda37aba15e82c10e919a22e7a62eba3b14aa6794cbcc680d305c2562"} Apr 16 16:30:32.387372 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:32.387346 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6fd87b8-mqjln" event={"ID":"160b9aaa-8178-4147-a5f7-c886dd47054c","Type":"ContainerStarted","Data":"327911f433435a5f7836df3e6ad68a25db21e3874fd2ae314063ecc45068c9df"} Apr 16 16:30:32.402619 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:32.402576 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-nknjf" podStartSLOduration=2.459809382 podStartE2EDuration="5.402564383s" podCreationTimestamp="2026-04-16 16:30:27 +0000 UTC" firstStartedPulling="2026-04-16 16:30:28.504646319 +0000 UTC m=+81.088715047" lastFinishedPulling="2026-04-16 16:30:31.447401303 +0000 UTC m=+84.031470048" observedRunningTime="2026-04-16 16:30:32.401132371 +0000 UTC m=+84.985201131" watchObservedRunningTime="2026-04-16 16:30:32.402564383 +0000 UTC m=+84.986633132" Apr 16 
16:30:32.921004 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:32.920972 2573 patch_prober.go:28] interesting pod/image-registry-65c845f5f8-7c7zn container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 16:30:32.921150 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:32.921028 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" podUID="0f60a8ab-e833-4e10-8ef6-e02666b7e615" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:30:34.294568 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:34.294537 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:30:35.703198 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.703147 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-8jqkf"] Apr 16 16:30:35.707080 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.706958 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.709636 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.709612 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 16:30:35.709882 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.709857 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-95zfb\"" Apr 16 16:30:35.710218 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.710202 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 16:30:35.710870 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.710845 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 16:30:35.711073 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.711040 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 16:30:35.798416 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.798380 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/efab21b1-d5b4-416a-9fad-13106ed37247-sys\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.798617 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.798488 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/efab21b1-d5b4-416a-9fad-13106ed37247-node-exporter-wtmp\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 
16 16:30:35.798617 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.798521 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/efab21b1-d5b4-416a-9fad-13106ed37247-node-exporter-accelerators-collector-config\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.798617 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.798549 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjjxz\" (UniqueName: \"kubernetes.io/projected/efab21b1-d5b4-416a-9fad-13106ed37247-kube-api-access-kjjxz\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.798617 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.798584 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/efab21b1-d5b4-416a-9fad-13106ed37247-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.798841 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.798618 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/efab21b1-d5b4-416a-9fad-13106ed37247-root\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.798841 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.798663 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/efab21b1-d5b4-416a-9fad-13106ed37247-node-exporter-textfile\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.798841 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.798697 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/efab21b1-d5b4-416a-9fad-13106ed37247-node-exporter-tls\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.798841 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.798723 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/efab21b1-d5b4-416a-9fad-13106ed37247-metrics-client-ca\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.902406 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.901829 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/efab21b1-d5b4-416a-9fad-13106ed37247-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.902406 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.901888 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/efab21b1-d5b4-416a-9fad-13106ed37247-root\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.902406 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.901918 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/efab21b1-d5b4-416a-9fad-13106ed37247-node-exporter-textfile\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.902406 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.901951 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/efab21b1-d5b4-416a-9fad-13106ed37247-node-exporter-tls\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.902406 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.901980 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/efab21b1-d5b4-416a-9fad-13106ed37247-metrics-client-ca\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.902406 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.902027 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/efab21b1-d5b4-416a-9fad-13106ed37247-sys\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.902406 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.902103 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/efab21b1-d5b4-416a-9fad-13106ed37247-node-exporter-wtmp\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.902406 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.902128 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/efab21b1-d5b4-416a-9fad-13106ed37247-node-exporter-accelerators-collector-config\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.902406 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.902153 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjjxz\" (UniqueName: \"kubernetes.io/projected/efab21b1-d5b4-416a-9fad-13106ed37247-kube-api-access-kjjxz\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.903350 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.903322 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/efab21b1-d5b4-416a-9fad-13106ed37247-root\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.903487 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.903460 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/efab21b1-d5b4-416a-9fad-13106ed37247-node-exporter-wtmp\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.903610 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.903532 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/efab21b1-d5b4-416a-9fad-13106ed37247-sys\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.903610 ip-10-0-139-12 kubenswrapper[2573]: I0416 
16:30:35.903570 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/efab21b1-d5b4-416a-9fad-13106ed37247-metrics-client-ca\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.904065 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:30:35.903335 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 16:30:35.904065 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:30:35.903833 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efab21b1-d5b4-416a-9fad-13106ed37247-node-exporter-tls podName:efab21b1-d5b4-416a-9fad-13106ed37247 nodeName:}" failed. No retries permitted until 2026-04-16 16:30:36.403809523 +0000 UTC m=+88.987878251 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/efab21b1-d5b4-416a-9fad-13106ed37247-node-exporter-tls") pod "node-exporter-8jqkf" (UID: "efab21b1-d5b4-416a-9fad-13106ed37247") : secret "node-exporter-tls" not found Apr 16 16:30:35.904065 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.903815 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/efab21b1-d5b4-416a-9fad-13106ed37247-node-exporter-textfile\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.904305 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.904178 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/efab21b1-d5b4-416a-9fad-13106ed37247-node-exporter-accelerators-collector-config\") pod \"node-exporter-8jqkf\" (UID: 
\"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.906122 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.906079 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/efab21b1-d5b4-416a-9fad-13106ed37247-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:35.915119 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:35.915092 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjjxz\" (UniqueName: \"kubernetes.io/projected/efab21b1-d5b4-416a-9fad-13106ed37247-kube-api-access-kjjxz\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:36.407680 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:36.407631 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/efab21b1-d5b4-416a-9fad-13106ed37247-node-exporter-tls\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:36.410144 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:36.410114 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/efab21b1-d5b4-416a-9fad-13106ed37247-node-exporter-tls\") pod \"node-exporter-8jqkf\" (UID: \"efab21b1-d5b4-416a-9fad-13106ed37247\") " pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:36.620677 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:36.620633 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-8jqkf" Apr 16 16:30:36.634475 ip-10-0-139-12 kubenswrapper[2573]: W0416 16:30:36.634398 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefab21b1_d5b4_416a_9fad_13106ed37247.slice/crio-8fca21b99c24717b1ead2c9ebf00984c24d9d5bd5f26b49b298c5d9654d730ef WatchSource:0}: Error finding container 8fca21b99c24717b1ead2c9ebf00984c24d9d5bd5f26b49b298c5d9654d730ef: Status 404 returned error can't find the container with id 8fca21b99c24717b1ead2c9ebf00984c24d9d5bd5f26b49b298c5d9654d730ef Apr 16 16:30:37.404967 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:37.404929 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8jqkf" event={"ID":"efab21b1-d5b4-416a-9fad-13106ed37247","Type":"ContainerStarted","Data":"8fca21b99c24717b1ead2c9ebf00984c24d9d5bd5f26b49b298c5d9654d730ef"} Apr 16 16:30:38.409680 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:38.409623 2573 generic.go:358] "Generic (PLEG): container finished" podID="efab21b1-d5b4-416a-9fad-13106ed37247" containerID="412bea3692dc36db7739b18116c44ebe46cb53c23a7766bc466be8a3f0b6a553" exitCode=0 Apr 16 16:30:38.410118 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:38.409693 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8jqkf" event={"ID":"efab21b1-d5b4-416a-9fad-13106ed37247","Type":"ContainerDied","Data":"412bea3692dc36db7739b18116c44ebe46cb53c23a7766bc466be8a3f0b6a553"} Apr 16 16:30:39.417991 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:39.417913 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8jqkf" event={"ID":"efab21b1-d5b4-416a-9fad-13106ed37247","Type":"ContainerStarted","Data":"b602bf94a8c7f997e367f1ff05b10231ccc20758a092bb659d09e404a90ecb67"} Apr 16 16:30:39.417991 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:39.417962 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8jqkf" event={"ID":"efab21b1-d5b4-416a-9fad-13106ed37247","Type":"ContainerStarted","Data":"be582509d7f033a5f2f0ecc7ad921192fce3e459c31c90e6451a5960cc432cb4"} Apr 16 16:30:39.436209 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:39.436154 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-8jqkf" podStartSLOduration=3.632431216 podStartE2EDuration="4.436135188s" podCreationTimestamp="2026-04-16 16:30:35 +0000 UTC" firstStartedPulling="2026-04-16 16:30:36.636809345 +0000 UTC m=+89.220878072" lastFinishedPulling="2026-04-16 16:30:37.440513301 +0000 UTC m=+90.024582044" observedRunningTime="2026-04-16 16:30:39.435845599 +0000 UTC m=+92.019914585" watchObservedRunningTime="2026-04-16 16:30:39.436135188 +0000 UTC m=+92.020203937" Apr 16 16:30:44.905985 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:30:44.905955 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-65c845f5f8-7c7zn"] Apr 16 16:31:09.929769 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:09.929708 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" podUID="0f60a8ab-e833-4e10-8ef6-e02666b7e615" containerName="registry" containerID="cri-o://e747aca1d6a04b4f44b37266fc3aa0cab2395a3f3a2514a8ffb3ebbcf4dc6a11" gracePeriod=30 Apr 16 16:31:10.171822 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.171796 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:31:10.272161 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.272084 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-certificates\") pod \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " Apr 16 16:31:10.272161 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.272131 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f60a8ab-e833-4e10-8ef6-e02666b7e615-trusted-ca\") pod \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " Apr 16 16:31:10.272363 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.272161 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r78n\" (UniqueName: \"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-kube-api-access-7r78n\") pod \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " Apr 16 16:31:10.272363 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.272267 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0f60a8ab-e833-4e10-8ef6-e02666b7e615-ca-trust-extracted\") pod \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " Apr 16 16:31:10.272363 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.272334 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0f60a8ab-e833-4e10-8ef6-e02666b7e615-installation-pull-secrets\") pod \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " Apr 
16 16:31:10.272512 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.272368 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls\") pod \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " Apr 16 16:31:10.272512 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.272409 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0f60a8ab-e833-4e10-8ef6-e02666b7e615-image-registry-private-configuration\") pod \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " Apr 16 16:31:10.272512 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.272436 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-bound-sa-token\") pod \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\" (UID: \"0f60a8ab-e833-4e10-8ef6-e02666b7e615\") " Apr 16 16:31:10.272688 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.272620 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0f60a8ab-e833-4e10-8ef6-e02666b7e615" (UID: "0f60a8ab-e833-4e10-8ef6-e02666b7e615"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:31:10.272688 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.272617 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f60a8ab-e833-4e10-8ef6-e02666b7e615-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0f60a8ab-e833-4e10-8ef6-e02666b7e615" (UID: "0f60a8ab-e833-4e10-8ef6-e02666b7e615"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:31:10.274812 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.274782 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f60a8ab-e833-4e10-8ef6-e02666b7e615-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "0f60a8ab-e833-4e10-8ef6-e02666b7e615" (UID: "0f60a8ab-e833-4e10-8ef6-e02666b7e615"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:31:10.274926 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.274838 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0f60a8ab-e833-4e10-8ef6-e02666b7e615" (UID: "0f60a8ab-e833-4e10-8ef6-e02666b7e615"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:31:10.274926 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.274862 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f60a8ab-e833-4e10-8ef6-e02666b7e615-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0f60a8ab-e833-4e10-8ef6-e02666b7e615" (UID: "0f60a8ab-e833-4e10-8ef6-e02666b7e615"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:31:10.274926 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.274880 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-kube-api-access-7r78n" (OuterVolumeSpecName: "kube-api-access-7r78n") pod "0f60a8ab-e833-4e10-8ef6-e02666b7e615" (UID: "0f60a8ab-e833-4e10-8ef6-e02666b7e615"). InnerVolumeSpecName "kube-api-access-7r78n". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:31:10.275044 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.275021 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0f60a8ab-e833-4e10-8ef6-e02666b7e615" (UID: "0f60a8ab-e833-4e10-8ef6-e02666b7e615"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:31:10.281383 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.281359 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f60a8ab-e833-4e10-8ef6-e02666b7e615-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0f60a8ab-e833-4e10-8ef6-e02666b7e615" (UID: "0f60a8ab-e833-4e10-8ef6-e02666b7e615"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:31:10.373647 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.373622 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0f60a8ab-e833-4e10-8ef6-e02666b7e615-installation-pull-secrets\") on node \"ip-10-0-139-12.ec2.internal\" DevicePath \"\"" Apr 16 16:31:10.373647 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.373644 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-tls\") on node \"ip-10-0-139-12.ec2.internal\" DevicePath \"\"" Apr 16 16:31:10.373805 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.373672 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0f60a8ab-e833-4e10-8ef6-e02666b7e615-image-registry-private-configuration\") on node \"ip-10-0-139-12.ec2.internal\" DevicePath \"\"" Apr 16 16:31:10.373805 
ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.373699 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-bound-sa-token\") on node \"ip-10-0-139-12.ec2.internal\" DevicePath \"\"" Apr 16 16:31:10.373805 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.373709 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0f60a8ab-e833-4e10-8ef6-e02666b7e615-registry-certificates\") on node \"ip-10-0-139-12.ec2.internal\" DevicePath \"\"" Apr 16 16:31:10.373805 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.373717 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f60a8ab-e833-4e10-8ef6-e02666b7e615-trusted-ca\") on node \"ip-10-0-139-12.ec2.internal\" DevicePath \"\"" Apr 16 16:31:10.373805 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.373726 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7r78n\" (UniqueName: \"kubernetes.io/projected/0f60a8ab-e833-4e10-8ef6-e02666b7e615-kube-api-access-7r78n\") on node \"ip-10-0-139-12.ec2.internal\" DevicePath \"\"" Apr 16 16:31:10.373805 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.373735 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0f60a8ab-e833-4e10-8ef6-e02666b7e615-ca-trust-extracted\") on node \"ip-10-0-139-12.ec2.internal\" DevicePath \"\"" Apr 16 16:31:10.508080 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.508051 2573 generic.go:358] "Generic (PLEG): container finished" podID="0f60a8ab-e833-4e10-8ef6-e02666b7e615" containerID="e747aca1d6a04b4f44b37266fc3aa0cab2395a3f3a2514a8ffb3ebbcf4dc6a11" exitCode=0 Apr 16 16:31:10.508202 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.508122 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" event={"ID":"0f60a8ab-e833-4e10-8ef6-e02666b7e615","Type":"ContainerDied","Data":"e747aca1d6a04b4f44b37266fc3aa0cab2395a3f3a2514a8ffb3ebbcf4dc6a11"} Apr 16 16:31:10.508202 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.508143 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" event={"ID":"0f60a8ab-e833-4e10-8ef6-e02666b7e615","Type":"ContainerDied","Data":"b709d2a2681ad556a4b161366e36af5096f5499643e57680488dd113e629eb54"} Apr 16 16:31:10.508202 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.508158 2573 scope.go:117] "RemoveContainer" containerID="e747aca1d6a04b4f44b37266fc3aa0cab2395a3f3a2514a8ffb3ebbcf4dc6a11" Apr 16 16:31:10.508202 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.508123 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-65c845f5f8-7c7zn" Apr 16 16:31:10.516323 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.516303 2573 scope.go:117] "RemoveContainer" containerID="e747aca1d6a04b4f44b37266fc3aa0cab2395a3f3a2514a8ffb3ebbcf4dc6a11" Apr 16 16:31:10.516571 ip-10-0-139-12 kubenswrapper[2573]: E0416 16:31:10.516551 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e747aca1d6a04b4f44b37266fc3aa0cab2395a3f3a2514a8ffb3ebbcf4dc6a11\": container with ID starting with e747aca1d6a04b4f44b37266fc3aa0cab2395a3f3a2514a8ffb3ebbcf4dc6a11 not found: ID does not exist" containerID="e747aca1d6a04b4f44b37266fc3aa0cab2395a3f3a2514a8ffb3ebbcf4dc6a11" Apr 16 16:31:10.516629 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.516582 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e747aca1d6a04b4f44b37266fc3aa0cab2395a3f3a2514a8ffb3ebbcf4dc6a11"} err="failed to get container status 
\"e747aca1d6a04b4f44b37266fc3aa0cab2395a3f3a2514a8ffb3ebbcf4dc6a11\": rpc error: code = NotFound desc = could not find container \"e747aca1d6a04b4f44b37266fc3aa0cab2395a3f3a2514a8ffb3ebbcf4dc6a11\": container with ID starting with e747aca1d6a04b4f44b37266fc3aa0cab2395a3f3a2514a8ffb3ebbcf4dc6a11 not found: ID does not exist" Apr 16 16:31:10.531925 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.531848 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-65c845f5f8-7c7zn"] Apr 16 16:31:10.533869 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:10.533848 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-65c845f5f8-7c7zn"] Apr 16 16:31:11.512921 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:11.512839 2573 generic.go:358] "Generic (PLEG): container finished" podID="38423965-6637-4f51-89af-e465c2846faf" containerID="7ecc84381aab7f6f9e15cf79f95672e1ba081c5a6598d463a00a75201b03df04" exitCode=0 Apr 16 16:31:11.512921 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:11.512897 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-b8wtr" event={"ID":"38423965-6637-4f51-89af-e465c2846faf","Type":"ContainerDied","Data":"7ecc84381aab7f6f9e15cf79f95672e1ba081c5a6598d463a00a75201b03df04"} Apr 16 16:31:11.513327 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:11.513156 2573 scope.go:117] "RemoveContainer" containerID="7ecc84381aab7f6f9e15cf79f95672e1ba081c5a6598d463a00a75201b03df04" Apr 16 16:31:11.977104 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:11.977069 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f60a8ab-e833-4e10-8ef6-e02666b7e615" path="/var/lib/kubelet/pods/0f60a8ab-e833-4e10-8ef6-e02666b7e615/volumes" Apr 16 16:31:12.516898 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:12.516865 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-b8wtr" event={"ID":"38423965-6637-4f51-89af-e465c2846faf","Type":"ContainerStarted","Data":"12fbfc5c0ada9f48061c383dcd0a855db25dd1affad6d4845eb49a8e4a48a901"} Apr 16 16:31:26.558854 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:26.558817 2573 generic.go:358] "Generic (PLEG): container finished" podID="91b28333-7ad2-40bf-b991-a75717a9ead4" containerID="1ed5f91797e7a918d7a3bb77baf9aa9545d290bcafbb19948843345cfc7bcfee" exitCode=0 Apr 16 16:31:26.559253 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:26.558867 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xx6cp" event={"ID":"91b28333-7ad2-40bf-b991-a75717a9ead4","Type":"ContainerDied","Data":"1ed5f91797e7a918d7a3bb77baf9aa9545d290bcafbb19948843345cfc7bcfee"} Apr 16 16:31:26.559253 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:26.559136 2573 scope.go:117] "RemoveContainer" containerID="1ed5f91797e7a918d7a3bb77baf9aa9545d290bcafbb19948843345cfc7bcfee" Apr 16 16:31:27.562744 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:31:27.562707 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xx6cp" event={"ID":"91b28333-7ad2-40bf-b991-a75717a9ead4","Type":"ContainerStarted","Data":"de0198577ba9116dd297199ae8b739bc9ae7d629c96267af10a74c9d921da54f"} Apr 16 16:34:07.904253 ip-10-0-139-12 kubenswrapper[2573]: I0416 16:34:07.904227 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 17:27:59.772864 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:27:59.772836 2573 ???:1] "http: TLS handshake error from 10.0.139.12:58520: EOF" Apr 16 17:27:59.779494 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:27:59.779471 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-q86dl_a8bd6f6a-c52e-44cd-8d08-c4d8f3a1d791/global-pull-secret-syncer/0.log" Apr 16 17:27:59.827816 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:27:59.827788 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-pgpjj_f931d8c4-2e7a-44a0-8e0b-ee392349e62d/konnectivity-agent/0.log" Apr 16 17:27:59.955486 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:27:59.955460 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-12.ec2.internal_f2b918a444bf444231f9bc2675345c3c/haproxy/0.log" Apr 16 17:28:02.994711 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:02.994686 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-4jtkd_d236fa1f-b4af-4f64-9c14-3fe8a1f8c91c/cluster-monitoring-operator/0.log" Apr 16 17:28:03.196472 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:03.196442 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8jqkf_efab21b1-d5b4-416a-9fad-13106ed37247/node-exporter/0.log" Apr 16 17:28:03.218500 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:03.218475 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8jqkf_efab21b1-d5b4-416a-9fad-13106ed37247/kube-rbac-proxy/0.log" Apr 16 17:28:03.245992 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:03.245934 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8jqkf_efab21b1-d5b4-416a-9fad-13106ed37247/init-textfile/0.log" Apr 16 17:28:05.109460 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:05.109396 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-s8lmq_deb6ad86-785f-4724-a0f9-ecc5341ee981/networking-console-plugin/0.log" Apr 16 17:28:05.583061 ip-10-0-139-12 kubenswrapper[2573]: I0416 
17:28:05.583032 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zd9fq_4b96129c-a95c-41b4-9236-3a4c2e5bc09d/console-operator/0.log" Apr 16 17:28:06.398267 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:06.398237 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-8jqx9_a7e24cf9-3d97-4d0c-b43a-731997d2fe53/volume-data-source-validator/0.log" Apr 16 17:28:07.110617 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.110592 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wgk4m_b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb/dns/0.log" Apr 16 17:28:07.128531 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.128506 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wgk4m_b2e4f2b4-f3f9-4e12-b40a-1ab2d4aefaeb/kube-rbac-proxy/0.log" Apr 16 17:28:07.150415 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.150395 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-78zgg_a2387338-f572-4af1-a326-edcc55b8077f/dns-node-resolver/0.log" Apr 16 17:28:07.209915 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.209889 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r"] Apr 16 17:28:07.210190 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.210177 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f60a8ab-e833-4e10-8ef6-e02666b7e615" containerName="registry" Apr 16 17:28:07.210232 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.210192 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f60a8ab-e833-4e10-8ef6-e02666b7e615" containerName="registry" Apr 16 17:28:07.210276 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.210250 2573 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="0f60a8ab-e833-4e10-8ef6-e02666b7e615" containerName="registry" Apr 16 17:28:07.213242 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.213224 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" Apr 16 17:28:07.215803 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.215785 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qzsql\"/\"openshift-service-ca.crt\"" Apr 16 17:28:07.215888 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.215822 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-qzsql\"/\"default-dockercfg-vd8tg\"" Apr 16 17:28:07.216718 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.216701 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qzsql\"/\"kube-root-ca.crt\"" Apr 16 17:28:07.221977 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.221947 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r"] Apr 16 17:28:07.289409 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.289382 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fe09741a-4e3b-4216-940a-c901faa95359-podres\") pod \"perf-node-gather-daemonset-2gr7r\" (UID: \"fe09741a-4e3b-4216-940a-c901faa95359\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" Apr 16 17:28:07.289409 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.289411 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe09741a-4e3b-4216-940a-c901faa95359-sys\") pod \"perf-node-gather-daemonset-2gr7r\" (UID: \"fe09741a-4e3b-4216-940a-c901faa95359\") " 
pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" Apr 16 17:28:07.289594 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.289449 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqsq2\" (UniqueName: \"kubernetes.io/projected/fe09741a-4e3b-4216-940a-c901faa95359-kube-api-access-tqsq2\") pod \"perf-node-gather-daemonset-2gr7r\" (UID: \"fe09741a-4e3b-4216-940a-c901faa95359\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" Apr 16 17:28:07.289594 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.289520 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fe09741a-4e3b-4216-940a-c901faa95359-proc\") pod \"perf-node-gather-daemonset-2gr7r\" (UID: \"fe09741a-4e3b-4216-940a-c901faa95359\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" Apr 16 17:28:07.289594 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.289570 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe09741a-4e3b-4216-940a-c901faa95359-lib-modules\") pod \"perf-node-gather-daemonset-2gr7r\" (UID: \"fe09741a-4e3b-4216-940a-c901faa95359\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" Apr 16 17:28:07.389872 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.389849 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fe09741a-4e3b-4216-940a-c901faa95359-podres\") pod \"perf-node-gather-daemonset-2gr7r\" (UID: \"fe09741a-4e3b-4216-940a-c901faa95359\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" Apr 16 17:28:07.389962 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.389877 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/fe09741a-4e3b-4216-940a-c901faa95359-sys\") pod \"perf-node-gather-daemonset-2gr7r\" (UID: \"fe09741a-4e3b-4216-940a-c901faa95359\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" Apr 16 17:28:07.389962 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.389912 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqsq2\" (UniqueName: \"kubernetes.io/projected/fe09741a-4e3b-4216-940a-c901faa95359-kube-api-access-tqsq2\") pod \"perf-node-gather-daemonset-2gr7r\" (UID: \"fe09741a-4e3b-4216-940a-c901faa95359\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" Apr 16 17:28:07.390065 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.389971 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe09741a-4e3b-4216-940a-c901faa95359-sys\") pod \"perf-node-gather-daemonset-2gr7r\" (UID: \"fe09741a-4e3b-4216-940a-c901faa95359\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" Apr 16 17:28:07.390065 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.390005 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fe09741a-4e3b-4216-940a-c901faa95359-proc\") pod \"perf-node-gather-daemonset-2gr7r\" (UID: \"fe09741a-4e3b-4216-940a-c901faa95359\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" Apr 16 17:28:07.390065 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.390031 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe09741a-4e3b-4216-940a-c901faa95359-lib-modules\") pod \"perf-node-gather-daemonset-2gr7r\" (UID: \"fe09741a-4e3b-4216-940a-c901faa95359\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" Apr 16 17:28:07.390188 ip-10-0-139-12 kubenswrapper[2573]: I0416 
17:28:07.390026 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fe09741a-4e3b-4216-940a-c901faa95359-podres\") pod \"perf-node-gather-daemonset-2gr7r\" (UID: \"fe09741a-4e3b-4216-940a-c901faa95359\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" Apr 16 17:28:07.390188 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.390072 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fe09741a-4e3b-4216-940a-c901faa95359-proc\") pod \"perf-node-gather-daemonset-2gr7r\" (UID: \"fe09741a-4e3b-4216-940a-c901faa95359\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" Apr 16 17:28:07.390188 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.390165 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe09741a-4e3b-4216-940a-c901faa95359-lib-modules\") pod \"perf-node-gather-daemonset-2gr7r\" (UID: \"fe09741a-4e3b-4216-940a-c901faa95359\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" Apr 16 17:28:07.397602 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.397583 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqsq2\" (UniqueName: \"kubernetes.io/projected/fe09741a-4e3b-4216-940a-c901faa95359-kube-api-access-tqsq2\") pod \"perf-node-gather-daemonset-2gr7r\" (UID: \"fe09741a-4e3b-4216-940a-c901faa95359\") " pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" Apr 16 17:28:07.522867 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.522832 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" Apr 16 17:28:07.638983 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.638959 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r"] Apr 16 17:28:07.641321 ip-10-0-139-12 kubenswrapper[2573]: W0416 17:28:07.641262 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfe09741a_4e3b_4216_940a_c901faa95359.slice/crio-ac913b7b591e1cb12b4060814bc98c221010c7dd63660762e77af4e1447fb430 WatchSource:0}: Error finding container ac913b7b591e1cb12b4060814bc98c221010c7dd63660762e77af4e1447fb430: Status 404 returned error can't find the container with id ac913b7b591e1cb12b4060814bc98c221010c7dd63660762e77af4e1447fb430 Apr 16 17:28:07.642992 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.642972 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:28:07.688928 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.688909 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pkd9h_dc8e5144-a514-40ce-8e9e-c4b13ea6d7af/node-ca/0.log" Apr 16 17:28:07.824444 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.824416 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" event={"ID":"fe09741a-4e3b-4216-940a-c901faa95359","Type":"ContainerStarted","Data":"f92f2404bb99a9baa809ddfa454a7e46b1da0e195ca3b4f8370a59ef9b3fd9fa"} Apr 16 17:28:07.824444 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.824447 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" event={"ID":"fe09741a-4e3b-4216-940a-c901faa95359","Type":"ContainerStarted","Data":"ac913b7b591e1cb12b4060814bc98c221010c7dd63660762e77af4e1447fb430"} Apr 16 17:28:07.824638 ip-10-0-139-12 
kubenswrapper[2573]: I0416 17:28:07.824528 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" Apr 16 17:28:07.839629 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:07.839591 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" podStartSLOduration=0.839579686 podStartE2EDuration="839.579686ms" podCreationTimestamp="2026-04-16 17:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:28:07.838047107 +0000 UTC m=+3540.422115896" watchObservedRunningTime="2026-04-16 17:28:07.839579686 +0000 UTC m=+3540.423648434" Apr 16 17:28:08.394589 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:08.394559 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5f6556957d-vngdp_27bbf9b0-7f1e-4f00-be95-79c80f1eae4a/router/0.log" Apr 16 17:28:08.723755 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:08.723615 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-btkpq_8ba73826-84fd-4301-9663-d9ff7925faa6/serve-healthcheck-canary/0.log" Apr 16 17:28:09.040448 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:09.040364 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-qhdbc_e8d29cd4-bc8b-4e6f-ae9b-d48e9d231c92/insights-operator/0.log" Apr 16 17:28:09.173347 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:09.173324 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nknjf_7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3/kube-rbac-proxy/0.log" Apr 16 17:28:09.191127 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:09.191106 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-nknjf_7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3/exporter/0.log" Apr 16 17:28:09.209007 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:09.208989 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nknjf_7e0f8ceb-99a1-42ac-8beb-298e0e5ce9a3/extractor/0.log" Apr 16 17:28:13.836662 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:13.836620 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-qzsql/perf-node-gather-daemonset-2gr7r" Apr 16 17:28:15.715166 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:15.715071 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-78npn_575d205c-5915-427a-881e-800cf69e2195/migrator/0.log" Apr 16 17:28:15.738664 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:15.738622 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-78npn_575d205c-5915-427a-881e-800cf69e2195/graceful-termination/0.log" Apr 16 17:28:16.048428 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:16.048349 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-b8wtr_38423965-6637-4f51-89af-e465c2846faf/kube-storage-version-migrator-operator/1.log" Apr 16 17:28:16.049184 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:16.049164 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-b8wtr_38423965-6637-4f51-89af-e465c2846faf/kube-storage-version-migrator-operator/0.log" Apr 16 17:28:16.846866 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:16.846837 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-7xh5k_6839f3b4-61f8-4813-988a-8f23df71a050/kube-multus/0.log" Apr 16 17:28:16.869959 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:16.869938 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7vvmn_34f77f7d-b36d-4041-8328-4e6971055b19/kube-multus-additional-cni-plugins/0.log" Apr 16 17:28:16.888778 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:16.888759 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7vvmn_34f77f7d-b36d-4041-8328-4e6971055b19/egress-router-binary-copy/0.log" Apr 16 17:28:16.906204 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:16.906189 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7vvmn_34f77f7d-b36d-4041-8328-4e6971055b19/cni-plugins/0.log" Apr 16 17:28:16.926133 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:16.926119 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7vvmn_34f77f7d-b36d-4041-8328-4e6971055b19/bond-cni-plugin/0.log" Apr 16 17:28:16.948260 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:16.948237 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7vvmn_34f77f7d-b36d-4041-8328-4e6971055b19/routeoverride-cni/0.log" Apr 16 17:28:16.971426 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:16.971407 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7vvmn_34f77f7d-b36d-4041-8328-4e6971055b19/whereabouts-cni-bincopy/0.log" Apr 16 17:28:16.993120 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:16.993100 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7vvmn_34f77f7d-b36d-4041-8328-4e6971055b19/whereabouts-cni/0.log" Apr 16 17:28:17.436876 ip-10-0-139-12 
kubenswrapper[2573]: I0416 17:28:17.436848 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vrlk5_bb3c2920-54bb-42cd-a7e2-0c22345e99b4/network-metrics-daemon/0.log" Apr 16 17:28:17.456727 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:17.456704 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vrlk5_bb3c2920-54bb-42cd-a7e2-0c22345e99b4/kube-rbac-proxy/0.log" Apr 16 17:28:18.537970 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:18.537940 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sg9b4_f0049088-3d94-496a-ba74-286b0c64fa8a/ovn-controller/0.log" Apr 16 17:28:18.571382 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:18.571354 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sg9b4_f0049088-3d94-496a-ba74-286b0c64fa8a/ovn-acl-logging/0.log" Apr 16 17:28:18.589489 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:18.589467 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sg9b4_f0049088-3d94-496a-ba74-286b0c64fa8a/kube-rbac-proxy-node/0.log" Apr 16 17:28:18.607968 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:18.607938 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sg9b4_f0049088-3d94-496a-ba74-286b0c64fa8a/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 17:28:18.623996 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:18.623972 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sg9b4_f0049088-3d94-496a-ba74-286b0c64fa8a/northd/0.log" Apr 16 17:28:18.643279 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:18.643259 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sg9b4_f0049088-3d94-496a-ba74-286b0c64fa8a/nbdb/0.log" Apr 16 17:28:18.661052 ip-10-0-139-12 
kubenswrapper[2573]: I0416 17:28:18.661018 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sg9b4_f0049088-3d94-496a-ba74-286b0c64fa8a/sbdb/0.log" Apr 16 17:28:18.753500 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:18.753461 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sg9b4_f0049088-3d94-496a-ba74-286b0c64fa8a/ovnkube-controller/0.log" Apr 16 17:28:20.035049 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:20.035018 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-zrwnz_fb9f58f2-5c1e-437f-8ea7-a75f424536f4/check-endpoints/0.log" Apr 16 17:28:20.094936 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:20.094909 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-xzlm9_6e5b8c51-4e68-4bf7-bb21-ec500c600441/network-check-target-container/0.log" Apr 16 17:28:20.891440 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:20.891411 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-26g6g_db546d9e-9684-452f-876d-0d945addae4c/iptables-alerter/0.log" Apr 16 17:28:21.545195 ip-10-0-139-12 kubenswrapper[2573]: I0416 17:28:21.545124 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-6t9jb_8ad1af0f-4c11-46e4-9fcc-8e22841d5e0e/tuned/0.log"