Apr 22 18:43:31.003592 ip-10-0-133-42 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 18:43:31.003602 ip-10-0-133-42 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 18:43:31.003612 ip-10-0-133-42 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 18:43:31.003983 ip-10-0-133-42 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 18:43:41.215946 ip-10-0-133-42 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 18:43:41.215963 ip-10-0-133-42 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 5bd72a0fad174dd7bcb45a9e4ab23b68 --
Apr 22 18:45:47.790875 ip-10-0-133-42 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:45:48.200380 ip-10-0-133-42 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:45:48.200380 ip-10-0-133-42 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:45:48.200380 ip-10-0-133-42 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:45:48.200380 ip-10-0-133-42 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:45:48.200380 ip-10-0-133-42 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
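The failure in the earlier boot is systemd-level, not kubelet-level: the unit references an environment file and a crio.service dependency that do not exist yet, so the 'start-pre' task and the restart job both fail with result 'resources'. A minimal way to confirm this, assuming the unit names kubelet.service and crio.service exactly as they appear above:

    systemctl cat kubelet.service          # show the EnvironmentFile= and ExecStartPre= lines the unit expects
    systemctl status crio.service          # check whether the CRI-O unit is installed and loaded at all
    journalctl -b -1 -u kubelet.service    # replay the failed boot's kubelet messages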
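The "Flag ... has been deprecated" warnings all point at the same fix: move the values into the file named by --config (the FLAG dump below shows --config="/etc/kubernetes/kubelet.conf"). A sketch of the equivalent config fragment, assuming the upstream KubeletConfiguration v1beta1 field names and a unix:// scheme on the CRI endpoint; it is written to a scratch path here because on OpenShift the real file is normally rendered by the Machine Config Operator rather than edited by hand. The --pod-infra-container-image warning is the exception: per the message, the sandbox image now comes from the CRI, so it belongs in the runtime's own config (CRI-O's pause_image) instead.

    # hypothetical fragment to merge into the kubelet config file; values taken
    # from the FLAG dump later in this boot
    cat <<'EOF' > /tmp/kubelet-conf-fragment.yaml
    containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
    systemReserved:
      cpu: 500m
      ephemeral-storage: 1Gi
      memory: 1Gi
    EOF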
Apr 22 18:45:48.203348 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.203262 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:45:48.206228 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206213 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:45:48.206228 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206228 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:45:48.206292 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206231 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:45:48.206292 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206235 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:45:48.206292 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206237 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:45:48.206292 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206240 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:45:48.206292 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206243 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:45:48.206292 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206246 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:45:48.206292 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206249 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:45:48.206292 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206251 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:45:48.206292 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206254 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:45:48.206292 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206257 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:45:48.206292 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206259 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:45:48.206292 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206262 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:45:48.206292 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206265 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:45:48.206292 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206267 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:45:48.206292 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206272 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:45:48.206292 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206274 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:45:48.206292 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206278 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:45:48.206292 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206280 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:45:48.206292 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206284 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:45:48.206292 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206286 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:45:48.206792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206289 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:45:48.206792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206292 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:45:48.206792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206294 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:45:48.206792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206297 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:45:48.206792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206300 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:45:48.206792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206303 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:45:48.206792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206306 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:45:48.206792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206309 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:45:48.206792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206312 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:45:48.206792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206314 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:45:48.206792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206317 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:45:48.206792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206320 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:45:48.206792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206322 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:45:48.206792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206325 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:45:48.206792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206328 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:45:48.206792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206330 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:45:48.206792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206332 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:45:48.206792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206335 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:45:48.206792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206338 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:45:48.207303 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206340 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:45:48.207303 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206344 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:45:48.207303 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206348 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:45:48.207303 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206351 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:45:48.207303 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206354 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:45:48.207303 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206357 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:45:48.207303 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206360 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:45:48.207303 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206362 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:45:48.207303 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206365 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:45:48.207303 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206368 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:45:48.207303 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206372 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:45:48.207303 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206375 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:45:48.207303 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206377 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:45:48.207303 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206380 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:45:48.207303 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206383 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:45:48.207303 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206386 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:45:48.207303 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206388 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:45:48.207303 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206391 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:45:48.207303 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206394 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:45:48.207856 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206396 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:45:48.207856 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206406 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:45:48.207856 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206410 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:45:48.207856 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206413 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:45:48.207856 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206415 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:45:48.207856 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206418 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:45:48.207856 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206421 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:45:48.207856 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206424 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:45:48.207856 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206426 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:45:48.207856 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206432 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:45:48.207856 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206436 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:45:48.207856 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206439 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:45:48.207856 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206442 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:45:48.207856 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206444 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:45:48.207856 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206447 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:45:48.207856 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206449 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:45:48.207856 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206452 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:45:48.207856 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206454 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:45:48.207856 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206457 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:45:48.208313 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206459 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:45:48.208313 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206462 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:45:48.208313 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206464 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:45:48.208313 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206468 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:45:48.208313 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206471 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:45:48.208313 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206474 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:45:48.208313 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206476 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:45:48.208313 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206870 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:45:48.208313 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206877 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:45:48.208313 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206880 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:45:48.208313 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206883 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:45:48.208313 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206886 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:45:48.208313 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206889 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:45:48.208313 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206891 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:45:48.208313 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206894 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:45:48.208313 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206896 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:45:48.208313 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206899 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:45:48.208313 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206902 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:45:48.208313 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206904 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:45:48.208313 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206907 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:45:48.208809 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206909 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:45:48.208809 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206912 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:45:48.208809 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206914 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:45:48.208809 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206917 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:45:48.208809 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206920 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:45:48.208809 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206922 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:45:48.208809 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206925 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:45:48.208809 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206927 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:45:48.208809 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206930 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:45:48.208809 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206932 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:45:48.208809 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206935 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:45:48.208809 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206937 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:45:48.208809 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206940 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:45:48.208809 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206942 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:45:48.208809 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206945 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:45:48.208809 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206948 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:45:48.208809 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206951 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:45:48.208809 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206954 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:45:48.208809 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206956 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:45:48.208809 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206959 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:45:48.209334 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206961 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:45:48.209334 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206964 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:45:48.209334 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206967 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:45:48.209334 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206969 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:45:48.209334 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206972 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:45:48.209334 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206974 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:45:48.209334 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206977 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:45:48.209334 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206979 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:45:48.209334 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206982 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:45:48.209334 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206984 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:45:48.209334 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206989 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:45:48.209334 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206992 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:45:48.209334 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206995 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:45:48.209334 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.206997 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:45:48.209334 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207000 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:45:48.209334 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207003 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:45:48.209334 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207005 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:45:48.209334 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207008 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:45:48.209334 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207011 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:45:48.209334 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207014 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:45:48.209850 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207017 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:45:48.209850 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207019 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:45:48.209850 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207022 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:45:48.209850 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207024 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:45:48.209850 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207027 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:45:48.209850 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207030 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:45:48.209850 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207033 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:45:48.209850 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207035 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:45:48.209850 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207038 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:45:48.209850 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207041 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:45:48.209850 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207043 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:45:48.209850 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207046 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:45:48.209850 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207048 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:45:48.209850 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207051 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:45:48.209850 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207053 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:45:48.209850 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207055 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:45:48.209850 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207058 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:45:48.209850 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207060 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:45:48.209850 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207063 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207065 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207069 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207072 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207075 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207077 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207081 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207083 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207086 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207088 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207091 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207093 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207096 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207098 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.207101 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208274 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208284 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208292 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208298 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208303 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208306 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:45:48.210355 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208311 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208319 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208323 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208327 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208330 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208334 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208337 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208340 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208344 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208347 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208350 2571 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208352 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208355 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208362 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208365 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208368 2571 flags.go:64] FLAG: --config-dir=""
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208371 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208374 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208378 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208381 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208385 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208388 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208391 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208394 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:45:48.210890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208397 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208400 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208403 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208407 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208410 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208413 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208417 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208420 2571 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208423 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208435 2571 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208438 2571 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208441 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208444 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208448 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208456 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208459 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208462 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208465 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208468 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208471 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208474 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208479 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208482 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208485 2571 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208488 2571 flags.go:64] FLAG: --feature-gates=""
Apr 22 18:45:48.211479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208491 2571 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208494 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208498 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208501 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208504 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208507 2571 flags.go:64] FLAG: --help="false" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208511 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-133-42.ec2.internal" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208514 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208517 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208520 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208523 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208527 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208530 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208545 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208548 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208551 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208554 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208557 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208560 2571 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208564 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208566 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208570 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208573 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208576 2571 flags.go:64] FLAG: --lock-file="" Apr 22 18:45:48.212105 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208578 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:45:48.212704 
ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208581 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208584 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208589 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208593 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208596 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208599 2571 flags.go:64] FLAG: --logging-format="text" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208602 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208605 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208608 2571 flags.go:64] FLAG: --manifest-url="" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208611 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208616 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208619 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208623 2571 flags.go:64] FLAG: --max-pods="110" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208626 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208629 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208632 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208635 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208638 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208642 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208645 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208652 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208655 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208658 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:45:48.212704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208661 2571 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208664 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:45:48.213282 
ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208670 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208673 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208677 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208680 2571 flags.go:64] FLAG: --port="10250" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208683 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208686 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a775edac7e9a3218" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208689 2571 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208692 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208695 2571 flags.go:64] FLAG: --register-node="true" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208698 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208702 2571 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208705 2571 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208708 2571 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208711 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208714 2571 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208718 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208721 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208724 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208727 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208729 2571 flags.go:64] FLAG: --runonce="false" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208732 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208735 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208738 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:45:48.213282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208741 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208744 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208747 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208750 2571 flags.go:64] FLAG: 
--storage-driver-host="localhost:8086" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208754 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208757 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208760 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208762 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208765 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208768 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208772 2571 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208774 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208781 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208784 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208786 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208791 2571 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208793 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208796 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208800 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208804 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208807 2571 flags.go:64] FLAG: --v="2" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208812 2571 flags.go:64] FLAG: --version="false" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208816 2571 flags.go:64] FLAG: --vmodule="" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208820 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.208823 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:45:48.213896 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208917 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:45:48.214905 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208920 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:45:48.214905 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208924 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:45:48.214905 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208926 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:45:48.214905 
ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208929 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:45:48.214905 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208932 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:45:48.214905 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208934 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:45:48.214905 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208937 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:45:48.214905 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208940 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:45:48.214905 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208942 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:45:48.214905 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208945 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:45:48.214905 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208949 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:45:48.214905 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208952 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:45:48.214905 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208955 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:45:48.214905 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208958 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:45:48.214905 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208961 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:45:48.214905 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208963 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:45:48.214905 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208966 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:45:48.214905 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208968 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:45:48.214905 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208971 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:45:48.214905 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208974 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:45:48.215723 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208977 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:45:48.215723 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208979 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:45:48.215723 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208982 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:45:48.215723 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208985 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:45:48.215723 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208987 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:45:48.215723 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208990 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:45:48.215723 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208993 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:45:48.215723 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208995 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:45:48.215723 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.208998 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:45:48.215723 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209000 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:45:48.215723 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209003 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:45:48.215723 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209005 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:45:48.215723 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209008 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:45:48.215723 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209010 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:45:48.215723 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209015 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:45:48.215723 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209018 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:45:48.215723 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209021 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:45:48.215723 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209024 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:45:48.215723 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209026 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:45:48.215723 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209029 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:45:48.216352 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209031 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:45:48.216352 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209034 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:45:48.216352 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209037 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:45:48.216352 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209039 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:45:48.216352 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209042 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:45:48.216352 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209044 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:45:48.216352 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209047 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:45:48.216352 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209050 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:45:48.216352 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209053 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:45:48.216352 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209055 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:45:48.216352 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209058 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:45:48.216352 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209060 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:45:48.216352 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209063 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:45:48.216352 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209066 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:45:48.216352 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209069 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:45:48.216352 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209071 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:45:48.216352 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209074 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:45:48.216352 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209076 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:45:48.216352 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209079 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:45:48.216352 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209082 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:45:48.217020 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209084 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:45:48.217020 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209087 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:45:48.217020 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209089 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:45:48.217020 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209092 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:45:48.217020 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209094 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:45:48.217020 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209097 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:45:48.217020 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209101 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
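The long run of "unrecognized feature gate" warnings above (and continuing below) is the kubelet's upstream feature-gate parser being handed OpenShift cluster-level gates it does not know about; it warns once per unknown gate on every pass over the configuration, which is why the same names recur with fresh timestamps. A quick way to see how many distinct gates are involved is to tally them from a saved journal dump. A minimal Go sketch follows; the input file name kubelet.log is an assumption, not something the log prescribes.

```go
// Tally the distinct gate names behind the "unrecognized feature gate"
// warnings in a saved journal dump (e.g. `journalctl -u kubelet > kubelet.log`).
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"sort"
)

func main() {
	f, err := os.Open("kubelet.log") // assumed dump location
	if err != nil {
		panic(err)
	}
	defer f.Close()

	re := regexp.MustCompile(`unrecognized feature gate: (\S+)`)
	counts := map[string]int{}
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // some records are very long
	for sc.Scan() {
		if m := re.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[1]]++
		}
	}
	names := make([]string, 0, len(counts))
	for n := range counts {
		names = append(names, n)
	}
	sort.Strings(names)
	for _, n := range names {
		// Expect one hit per parse pass, e.g. "GatewayAPI 3" for this boot.
		fmt.Printf("%-60s %d\n", n, counts[n])
	}
}
```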
Apr 22 18:45:48.217020 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209105 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:45:48.217020 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209107 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:45:48.217020 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209110 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:45:48.217020 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209113 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:45:48.217020 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209115 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:45:48.217020 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209118 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:45:48.217020 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209120 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:45:48.217020 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209123 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:45:48.217020 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209126 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:45:48.217020 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209128 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:45:48.217020 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209131 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:45:48.217020 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209133 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:45:48.217855 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209136 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:45:48.217855 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209138 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:45:48.217855 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209141 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:45:48.217855 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209144 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:45:48.217855 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209146 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:45:48.217855 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.209149 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:45:48.217855 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.209157 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:45:48.218727 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.218705 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 18:45:48.218727 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.218727 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 18:45:48.218874 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218863 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:45:48.218925 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218876 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:45:48.218925 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218883 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:45:48.218925 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218888 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:45:48.218925 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218899 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:45:48.218925 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218904 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:45:48.218925 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218909 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:45:48.218925 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218914 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:45:48.218925 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218919 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:45:48.218925 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218923 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:45:48.218925 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218928 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:45:48.219356 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218933 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:45:48.219356 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218937 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:45:48.219356 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218942 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:45:48.219356 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218946 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:45:48.219356 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218951 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:45:48.219356 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218960 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:45:48.219356 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218964 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:45:48.219356 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218969 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:45:48.219356 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218973 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:45:48.219356 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218978 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:45:48.219356 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218983 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:45:48.219356 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218988 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:45:48.219356 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218992 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:45:48.219356 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.218997 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:45:48.219356 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219001 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:45:48.219356 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219005 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:45:48.219356 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219010 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:45:48.219356 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219014 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:45:48.219356 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219023 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:45:48.220181 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219027 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:45:48.220181 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219031 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:45:48.220181 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219036 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:45:48.220181 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219041 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:45:48.220181 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219045 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:45:48.220181 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219052 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:45:48.220181 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219059 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:45:48.220181 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219065 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:45:48.220181 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219069 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:45:48.220181 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219074 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:45:48.220181 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219078 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:45:48.220181 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219088 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:45:48.220181 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219093 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:45:48.220181 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219098 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:45:48.220181 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219103 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:45:48.220181 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219108 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:45:48.220181 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219114 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:45:48.220181 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219119 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:45:48.220181 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219124 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:45:48.220792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219129 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:45:48.220792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219133 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:45:48.220792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219137 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:45:48.220792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219142 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:45:48.220792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219147 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:45:48.220792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219157 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:45:48.220792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219162 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:45:48.220792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219167 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:45:48.220792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219171 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:45:48.220792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219175 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:45:48.220792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219180 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:45:48.220792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219184 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:45:48.220792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219189 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:45:48.220792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219193 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:45:48.220792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219197 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:45:48.220792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219202 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:45:48.220792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219208 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:45:48.220792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219218 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:45:48.220792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219223 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:45:48.220792 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219227 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:45:48.221351 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219232 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:45:48.221351 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219235 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:45:48.221351 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219240 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:45:48.221351 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219244 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:45:48.221351 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219249 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:45:48.221351 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219253 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:45:48.221351 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219257 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:45:48.221351 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219263 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:45:48.221351 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219267 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:45:48.221351 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219272 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:45:48.221351 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219281 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:45:48.221351 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219286 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:45:48.221351 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219291 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:45:48.221351 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219295 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:45:48.221351 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219300 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:45:48.221351 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219304 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:45:48.221351 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219309 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:45:48.221786 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.219317 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:45:48.221786 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219679 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:45:48.221786 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219690 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:45:48.221786 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219695 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:45:48.221786 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219700 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:45:48.221786 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219705 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:45:48.221786 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219709 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:45:48.221786 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219714 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:45:48.221786 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219719 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:45:48.221786 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219723 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:45:48.221786 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219728 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:45:48.221786 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219732 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:45:48.221786 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219742 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:45:48.221786 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219746 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:45:48.221786 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219751 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:45:48.222186 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219755 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:45:48.222186 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219759 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:45:48.222186 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219763 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:45:48.222186 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219768 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:45:48.222186 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219772 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:45:48.222186 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219776 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:45:48.222186 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219781 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:45:48.222186 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219785 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:45:48.222186 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219790 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:45:48.222186 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219799 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:45:48.222186 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219805 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:45:48.222186 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219818 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:45:48.222186 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219824 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:45:48.222186 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219829 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:45:48.222186 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219834 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:45:48.222186 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219839 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:45:48.222186 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219843 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:45:48.222186 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219848 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:45:48.222186 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219852 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:45:48.222861 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219858 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:45:48.222861 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219863 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:45:48.222861 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219867 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:45:48.222861 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219877 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:45:48.222861 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219884 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:45:48.222861 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219889 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:45:48.222861 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219893 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:45:48.222861 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219897 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:45:48.222861 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219902 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:45:48.222861 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219906 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:45:48.222861 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219911 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:45:48.222861 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219915 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:45:48.222861 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219920 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:45:48.222861 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219925 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:45:48.222861 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219929 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:45:48.222861 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219938 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:45:48.222861 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219943 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:45:48.222861 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219947 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:45:48.222861 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219952 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:45:48.223654 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219956 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:45:48.223654 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219960 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:45:48.223654 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219965 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:45:48.223654 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219970 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:45:48.223654 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219974 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:45:48.223654 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219978 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:45:48.223654 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219983 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:45:48.223654 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219987 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:45:48.223654 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.219991 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:45:48.223654 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220000 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:45:48.223654 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220005 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:45:48.223654 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220009 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:45:48.223654 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220013 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:45:48.223654 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220017 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:45:48.223654 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220022 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:45:48.223654 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220027 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:45:48.223654 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220031 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:45:48.223654 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220036 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:45:48.223654 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220040 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:45:48.223654 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220044 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:45:48.224146 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220049 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:45:48.224146 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220057 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:45:48.224146 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220061 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:45:48.224146 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220066 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:45:48.224146 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220070 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:45:48.224146 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220075 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:45:48.224146 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220079 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:45:48.224146 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220084 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:45:48.224146 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220088 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:45:48.224146 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220092 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:45:48.224146 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220097 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:45:48.224146 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220101 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:45:48.224146 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220105 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:45:48.224146 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:48.220109 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:45:48.224146 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.220123 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:45:48.224146 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.220986 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 18:45:48.225850 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.225833 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 18:45:48.227036 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.227022 2571 server.go:1019] "Starting client certificate rotation"
Apr 22 18:45:48.227129 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.227116 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:45:48.227170 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.227149 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:45:48.249099 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.249080 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:45:48.251923 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.251904 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:45:48.265573 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.265548 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 22 18:45:48.270916 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.270899 2571 log.go:25] "Validated CRI v1 image API"
Apr 22 18:45:48.272174 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.272156 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 18:45:48.276630 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.276609 2571 fs.go:135] Filesystem UUIDs: map[0e62c224-78d9-4f49-aab4-37bebcd6206b:/dev/nvme0n1p4 7119947a-543f-496f-81ff-96b1c775ddd9:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 22 18:45:48.276705 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.276628 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 18:45:48.282414 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.282293 2571 manager.go:217] Machine: {Timestamp:2026-04-22 18:45:48.280438762 +0000 UTC m=+0.378788693 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101392 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2562136582812f707fa937791d2646 SystemUUID:ec256213-6582-812f-707f-a937791d2646 BootID:5bd72a0f-ad17-4dd7-bcb4-5a9e4ab23b68 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:0a:f4:d9:22:45 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:0a:f4:d9:22:45 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:92:0d:19:c9:5a:b0 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 18:45:48.282414 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.282409 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
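Each parse pass ends with an I-level feature_gate.go:384 record carrying the effective gate map, and all three summaries in this boot agree. When only the effective values matter, the {map[...]} payload can be turned into a map directly; a minimal Go sketch, assuming the summary format shown above stays stable:

```go
// Parse a "feature gates: {map[Name:bool ...]}" summary record into a map.
package main

import (
	"fmt"
	"strconv"
	"strings"
)

func parseGateSummary(line string) (map[string]bool, error) {
	start := strings.Index(line, "{map[")
	end := strings.LastIndex(line, "]}")
	if start < 0 || end < start {
		return nil, fmt.Errorf("no gate map in line")
	}
	gates := map[string]bool{}
	for _, pair := range strings.Fields(line[start+len("{map[") : end]) {
		name, val, ok := strings.Cut(pair, ":")
		if !ok {
			continue
		}
		b, err := strconv.ParseBool(val)
		if err != nil {
			return nil, err
		}
		gates[name] = b
	}
	return gates, nil
}

func main() {
	// Shortened example taken from the summary records above.
	line := `feature gates: {map[ImageVolume:true KMSv1:true NodeSwap:false]}`
	gates, err := parseGateSummary(line)
	if err != nil {
		panic(err)
	}
	fmt.Println(gates["KMSv1"], gates["NodeSwap"]) // true false
}
```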
Apr 22 18:45:48.282549 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.282492 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 18:45:48.283476 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.283453 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 18:45:48.283636 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.283479 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-42.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 18:45:48.283680 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.283645 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 18:45:48.283680 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.283655 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 18:45:48.283680 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.283668 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 18:45:48.284326 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.284317 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 18:45:48.285066 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.285052 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:45:48.285864 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.285853 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 18:45:48.285984 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.285974 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
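The "Creating Container Manager object based on Node Config" record above embeds the resolved node configuration as a JSON object after nodeConfig=, including the values that matter for capacity planning (SystemReserved cpu 500m / memory 1Gi / ephemeral-storage 1Gi, PodPidsLimit 4096, and the hard-eviction thresholds). A sketch that extracts that object by brace-matching and unmarshals it follows; the record string in main is shortened for illustration, and the brace-matching assumes no braces inside JSON string values, which holds for this record:

```go
// Extract and decode the JSON object embedded after "nodeConfig=" in the
// container-manager startup record.
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

func extractNodeConfig(record string) (map[string]any, error) {
	i := strings.Index(record, "nodeConfig=")
	if i < 0 {
		return nil, fmt.Errorf("no nodeConfig in record")
	}
	s := record[i+len("nodeConfig="):]
	depth := 0
	for j, r := range s { // take the brace-balanced JSON object
		switch r {
		case '{':
			depth++
		case '}':
			depth--
			if depth == 0 {
				var cfg map[string]any
				if err := json.Unmarshal([]byte(s[:j+1]), &cfg); err != nil {
					return nil, err
				}
				return cfg, nil
			}
		}
	}
	return nil, fmt.Errorf("unbalanced braces after nodeConfig=")
}

func main() {
	// Shortened stand-in for the full record above.
	record := `... nodeConfig={"PodPidsLimit":4096,"SystemReserved":{"cpu":"500m","memory":"1Gi"}}`
	cfg, err := extractNodeConfig(record)
	if err != nil {
		panic(err)
	}
	fmt.Println(cfg["PodPidsLimit"], cfg["SystemReserved"]) // 4096 map[cpu:500m memory:1Gi]
}
```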
Apr 22 18:45:48.288632 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.288620 2571 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 18:45:48.288696 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.288635 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 18:45:48.288696 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.288652 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 18:45:48.288696 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.288663 2571 kubelet.go:397] "Adding apiserver pod source"
Apr 22 18:45:48.288696 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.288671 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 18:45:48.289747 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.289731 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 18:45:48.289811 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.289758 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 18:45:48.294297 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.294275 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 18:45:48.295774 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.295755 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 18:45:48.297332 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.297315 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 18:45:48.297332 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.297336 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 18:45:48.297453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.297344 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 18:45:48.297453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.297351 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 18:45:48.297453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.297357 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 18:45:48.297453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.297362 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 18:45:48.297453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.297369 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 18:45:48.297453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.297374 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 18:45:48.297453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.297382 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 18:45:48.297453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.297388 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 18:45:48.297453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.297396 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 18:45:48.297453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.297404 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 18:45:48.298175 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.298164 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 18:45:48.298214 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.298178 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 18:45:48.300799 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.300783 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-42.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 22 18:45:48.301488 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:48.301461 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-42.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 18:45:48.301585 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:48.301461 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 22 18:45:48.302056 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.302040 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 18:45:48.302056 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.302075 2571 server.go:1295] "Started kubelet"
Apr 22 18:45:48.302202 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.302172 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 18:45:48.302202 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.302164 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 18:45:48.302264 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.302232 2571 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 18:45:48.302917 ip-10-0-133-42 systemd[1]: Started Kubernetes Kubelet.
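The "system:anonymous ... forbidden" errors a few records above are expected at this stage: the kubelet is still on its bootstrap credentials, and until the certificate rotation started at 18:45:48.227149 produces an issued client certificate (csr-gqbp6, approved and issued a little further below), some requests go out unauthenticated and are rejected by RBAC. Anonymous-forbidden errors that persist after issuance would point at a real authorization problem instead. A minimal Go sketch that makes that distinction over a saved dump; the file name is again an assumption:

```go
// Flag system:anonymous "forbidden" errors that appear *after* the bootstrap
// CSR was issued, i.e. errors that are not just bootstrap-window noise.
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	f, err := os.Open("kubelet.log") // assumed `journalctl -u kubelet` dump
	if err != nil {
		panic(err)
	}
	defer f.Close()

	issued := false
	suspicious := 0
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		line := sc.Text()
		if strings.Contains(line, "Certificate signing request is issued") {
			issued = true
		}
		if issued && strings.Contains(line, "system:anonymous") {
			suspicious++
		}
	}
	if suspicious > 0 {
		fmt.Printf("%d anonymous-forbidden errors after cert issuance\n", suspicious)
	} else {
		fmt.Println("anonymous errors confined to the bootstrap window")
	}
}
```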
Apr 22 18:45:48.303354 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.303322 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 18:45:48.304585 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.304574 2571 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 18:45:48.308817 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.308785 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 18:45:48.309208 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.309188 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 18:45:48.309817 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.309797 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 18:45:48.309817 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.309800 2571 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 18:45:48.309961 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.309829 2571 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 18:45:48.310661 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:48.310143 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-42.ec2.internal\" not found"
Apr 22 18:45:48.310661 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.310296 2571 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 18:45:48.310661 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.310308 2571 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 18:45:48.310661 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.310356 2571 factory.go:55] Registering systemd factory
Apr 22 18:45:48.310661 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.310376 2571 factory.go:223] Registration of the systemd container factory successfully
Apr 22 18:45:48.310932 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.310665 2571 factory.go:153] Registering CRI-O factory
Apr 22 18:45:48.310932 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.310681 2571 factory.go:223] Registration of the crio container factory successfully
Apr 22 18:45:48.310932 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.310738 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 18:45:48.310932 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.310764 2571 factory.go:103] Registering Raw factory
Apr 22 18:45:48.310932 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.310779 2571 manager.go:1196] Started watching for new ooms in manager
Apr 22 18:45:48.311198 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.311182 2571 manager.go:319] Starting recovery of all containers
Apr 22 18:45:48.318205 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:48.317975 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-42.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 22 18:45:48.318524 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:48.318498 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 22 18:45:48.319504 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.319448 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 18:45:48.324215 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.324197 2571 manager.go:324] Recovery completed
Apr 22 18:45:48.325145 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:48.321367 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-42.ec2.internal.18a8c22c3fc789ad default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-42.ec2.internal,UID:ip-10-0-133-42.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-42.ec2.internal,},FirstTimestamp:2026-04-22 18:45:48.302051757 +0000 UTC m=+0.400401691,LastTimestamp:2026-04-22 18:45:48.302051757 +0000 UTC m=+0.400401691,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-42.ec2.internal,}"
Apr 22 18:45:48.327845 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.327823 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gqbp6"
Apr 22 18:45:48.329127 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.329110 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:45:48.331557 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.331528 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-42.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:45:48.331625 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.331569 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-42.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:45:48.331625 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.331579 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-42.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:45:48.332059 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.332046 2571 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 18:45:48.332059 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.332057 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 18:45:48.332149 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.332072 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 18:45:48.333454 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:48.333396 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-42.ec2.internal.18a8c22c4189bcb0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-42.ec2.internal,UID:ip-10-0-133-42.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-133-42.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-133-42.ec2.internal,},FirstTimestamp:2026-04-22 18:45:48.331556016 +0000 UTC m=+0.429905947,LastTimestamp:2026-04-22 18:45:48.331556016 +0000 UTC m=+0.429905947,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-42.ec2.internal,}"
Apr 22 18:45:48.335252 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.335239 2571 policy_none.go:49] "None policy: Start"
Apr 22 18:45:48.335313 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.335255 2571 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 18:45:48.335313 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.335265 2571 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 18:45:48.340845 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.340828 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gqbp6"
Apr 22 18:45:48.342474 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:48.342405 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-42.ec2.internal.18a8c22c418a033c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-42.ec2.internal,UID:ip-10-0-133-42.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-133-42.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-133-42.ec2.internal,},FirstTimestamp:2026-04-22 18:45:48.331574076 +0000 UTC m=+0.429924007,LastTimestamp:2026-04-22 18:45:48.331574076 +0000 UTC m=+0.429924007,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-42.ec2.internal,}"
Apr 22 18:45:48.380117 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.380099 2571 manager.go:341] "Starting Device Plugin manager"
Apr 22 18:45:48.403514 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:48.380134 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 18:45:48.403514 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.380145 2571 server.go:85] "Starting device plugin registration server"
Apr 22 18:45:48.403514 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.380388 2571 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 18:45:48.403514 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.380399 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 18:45:48.403514 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.380490 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 18:45:48.403514 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.380588 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 18:45:48.403514 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.380600 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 18:45:48.403514 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:48.381106 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 18:45:48.403514 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:48.381141 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-42.ec2.internal\" not found"
Apr 22 18:45:48.422203 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.422171 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 18:45:48.422203 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.422207 2571 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 18:45:48.422329 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.422225 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 18:45:48.422329 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.422232 2571 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 18:45:48.422329 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:48.422266 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 18:45:48.428650 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.428627 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:45:48.480575 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.480486 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:45:48.481645 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.481629 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-42.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:45:48.481728 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.481665 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-42.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:45:48.481728 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.481677 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-42.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:45:48.481728 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.481700 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-42.ec2.internal"
Apr 22 18:45:48.492459 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.492439 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-42.ec2.internal"
Apr 22 18:45:48.492508 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:48.492463 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-42.ec2.internal\": node \"ip-10-0-133-42.ec2.internal\" not found"
Apr 22 18:45:48.509360 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:48.509329 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-42.ec2.internal\" not found"
Apr 22 18:45:48.523112 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.523090 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-42.ec2.internal"]
Apr 22 18:45:48.523173 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.523157 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:45:48.523995 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.523981 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-42.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:45:48.524057 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.524009 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-42.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:45:48.524057 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.524019 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-42.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:45:48.526226 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.526213 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:45:48.526380 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.526366 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal"
Apr 22 18:45:48.526430 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.526396 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:45:48.526934 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.526917 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-42.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:45:48.527014 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.526923 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-42.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:45:48.527014 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.526972 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-42.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:45:48.527014 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.526988 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-42.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:45:48.527014 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.526948 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-42.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:45:48.527014 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.527011 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-42.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:45:48.529120 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.529106 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-42.ec2.internal" Apr 22 18:45:48.529173 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.529129 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:45:48.529769 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.529756 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-42.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:45:48.529851 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.529777 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-42.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:45:48.529851 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.529790 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-42.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:45:48.552203 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:48.552183 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-42.ec2.internal\" not found" node="ip-10-0-133-42.ec2.internal" Apr 22 18:45:48.556622 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:48.556598 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-42.ec2.internal\" not found" node="ip-10-0-133-42.ec2.internal" Apr 22 18:45:48.609621 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:48.609595 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-42.ec2.internal\" not found" Apr 22 18:45:48.611818 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.611802 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d02647afefe84c2445ae8b27f0b7998e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal\" (UID: \"d02647afefe84c2445ae8b27f0b7998e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal" Apr 22 18:45:48.611875 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.611831 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d02647afefe84c2445ae8b27f0b7998e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal\" (UID: \"d02647afefe84c2445ae8b27f0b7998e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal" Apr 22 18:45:48.611875 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.611849 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/380fabb8e9eee8f3530eb1504d622a92-config\") pod \"kube-apiserver-proxy-ip-10-0-133-42.ec2.internal\" (UID: \"380fabb8e9eee8f3530eb1504d622a92\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-42.ec2.internal" Apr 22 18:45:48.710000 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:48.709971 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-42.ec2.internal\" not found" Apr 22 18:45:48.712161 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.712143 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d02647afefe84c2445ae8b27f0b7998e-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal\" (UID: \"d02647afefe84c2445ae8b27f0b7998e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal" Apr 22 18:45:48.712203 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.712171 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d02647afefe84c2445ae8b27f0b7998e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal\" (UID: \"d02647afefe84c2445ae8b27f0b7998e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal" Apr 22 18:45:48.712238 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.712197 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/380fabb8e9eee8f3530eb1504d622a92-config\") pod \"kube-apiserver-proxy-ip-10-0-133-42.ec2.internal\" (UID: \"380fabb8e9eee8f3530eb1504d622a92\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-42.ec2.internal" Apr 22 18:45:48.712267 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.712238 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d02647afefe84c2445ae8b27f0b7998e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal\" (UID: \"d02647afefe84c2445ae8b27f0b7998e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal" Apr 22 18:45:48.712267 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.712251 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d02647afefe84c2445ae8b27f0b7998e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal\" (UID: \"d02647afefe84c2445ae8b27f0b7998e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal" Apr 22 18:45:48.712325 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.712302 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/380fabb8e9eee8f3530eb1504d622a92-config\") pod \"kube-apiserver-proxy-ip-10-0-133-42.ec2.internal\" (UID: \"380fabb8e9eee8f3530eb1504d622a92\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-42.ec2.internal" Apr 22 18:45:48.810746 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:48.810678 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-42.ec2.internal\" not found" Apr 22 18:45:48.854426 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.854397 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal" Apr 22 18:45:48.859030 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:48.859014 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-42.ec2.internal" Apr 22 18:45:48.910775 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:48.910738 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-42.ec2.internal\" not found" Apr 22 18:45:49.011203 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:49.011167 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-42.ec2.internal\" not found" Apr 22 18:45:49.111730 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:49.111648 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-42.ec2.internal\" not found" Apr 22 18:45:49.212167 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:49.212139 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-42.ec2.internal\" not found" Apr 22 18:45:49.227438 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.227419 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 18:45:49.227569 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.227554 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:45:49.279971 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.279944 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:45:49.288968 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.288948 2571 apiserver.go:52] "Watching apiserver" Apr 22 18:45:49.299408 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.299384 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 18:45:49.300145 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.300117 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-qzzp6","openshift-image-registry/node-ca-rvx2v","openshift-multus/multus-additional-cni-plugins-brbjp","openshift-multus/multus-hsb2f","openshift-ovn-kubernetes/ovnkube-node-vmbf4","kube-system/konnectivity-agent-2s28b","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q","openshift-multus/network-metrics-daemon-v6b2n","openshift-network-diagnostics/network-check-target-bvpqn","openshift-network-operator/iptables-alerter-gvq84","openshift-cluster-node-tuning-operator/tuned-4wgsq"] Apr 22 18:45:49.303104 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.303085 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qzzp6" Apr 22 18:45:49.305497 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.305480 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-rvx2v" Apr 22 18:45:49.305679 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.305662 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:45:49.305736 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.305725 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sf2bc\"" Apr 22 18:45:49.305938 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.305924 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:45:49.308052 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.307892 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 18:45:49.308052 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.307979 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 18:45:49.308052 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.307994 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 18:45:49.308419 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.308322 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-zxb2x\"" Apr 22 18:45:49.308969 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.308924 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 18:45:49.309095 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.309074 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.309471 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.309458 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-42.ec2.internal" Apr 22 18:45:49.311739 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.311722 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.311832 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.311779 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 18:45:49.311915 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.311867 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 18:45:49.311958 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.311916 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 18:45:49.312076 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.312061 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-f6vt7\"" Apr 22 18:45:49.312138 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.312078 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 18:45:49.312138 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.312097 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 18:45:49.314483 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.314465 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.314598 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.314486 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 18:45:49.314924 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.314901 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-k7lhr\"" Apr 22 18:45:49.315424 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.315408 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2tqk\" (UniqueName: \"kubernetes.io/projected/e63b9430-ddb4-4626-8019-7bfa90ffac77-kube-api-access-c2tqk\") pod \"node-ca-rvx2v\" (UID: \"e63b9430-ddb4-4626-8019-7bfa90ffac77\") " pod="openshift-image-registry/node-ca-rvx2v" Apr 22 18:45:49.315474 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.315433 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/22922279-9d57-4b39-9e9b-25a133f37c1b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.315474 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.315451 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sftj\" (UniqueName: \"kubernetes.io/projected/22922279-9d57-4b39-9e9b-25a133f37c1b-kube-api-access-8sftj\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.315584 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.315474 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-multus-socket-dir-parent\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.315584 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.315512 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0c75569b-ad2e-4296-8dad-807a8c913df1-hosts-file\") pod \"node-resolver-qzzp6\" (UID: \"0c75569b-ad2e-4296-8dad-807a8c913df1\") " pod="openshift-dns/node-resolver-qzzp6" Apr 22 18:45:49.315584 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.315571 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6prk4\" (UniqueName: \"kubernetes.io/projected/0c75569b-ad2e-4296-8dad-807a8c913df1-kube-api-access-6prk4\") pod \"node-resolver-qzzp6\" (UID: \"0c75569b-ad2e-4296-8dad-807a8c913df1\") " pod="openshift-dns/node-resolver-qzzp6" Apr 22 18:45:49.315709 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.315604 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/22922279-9d57-4b39-9e9b-25a133f37c1b-cnibin\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.315709 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.315635 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-host-run-k8s-cni-cncf-io\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.315709 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.315656 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-host-var-lib-cni-bin\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.315709 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.315672 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-os-release\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.315709 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.315689 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0c75569b-ad2e-4296-8dad-807a8c913df1-tmp-dir\") pod \"node-resolver-qzzp6\" (UID: \"0c75569b-ad2e-4296-8dad-807a8c913df1\") " pod="openshift-dns/node-resolver-qzzp6" Apr 22 18:45:49.315943 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.315716 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e63b9430-ddb4-4626-8019-7bfa90ffac77-host\") pod \"node-ca-rvx2v\" (UID: \"e63b9430-ddb4-4626-8019-7bfa90ffac77\") " pod="openshift-image-registry/node-ca-rvx2v" Apr 22 18:45:49.315943 ip-10-0-133-42 kubenswrapper[2571]: I0422 
18:45:49.315762 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/22922279-9d57-4b39-9e9b-25a133f37c1b-cni-binary-copy\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.315943 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.315789 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-system-cni-dir\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.315943 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.315812 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-cnibin\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.315943 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.315837 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-host-run-multus-certs\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.315943 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.315888 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e63b9430-ddb4-4626-8019-7bfa90ffac77-serviceca\") pod \"node-ca-rvx2v\" (UID: \"e63b9430-ddb4-4626-8019-7bfa90ffac77\") " pod="openshift-image-registry/node-ca-rvx2v" Apr 22 18:45:49.315943 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.315910 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/22922279-9d57-4b39-9e9b-25a133f37c1b-os-release\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.315943 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.315933 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4867ac2c-3d1c-44a3-b5d4-495f207482ed-multus-daemon-config\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.316252 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.315959 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/22922279-9d57-4b39-9e9b-25a133f37c1b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.316252 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.315993 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-multus-cni-dir\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.316252 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.316018 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-etc-kubernetes\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.316252 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.316037 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-host-run-netns\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.316252 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.316056 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-host-var-lib-cni-multus\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.316252 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.316070 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-host-var-lib-kubelet\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.316252 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.316084 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22922279-9d57-4b39-9e9b-25a133f37c1b-system-cni-dir\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.316252 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.316119 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/22922279-9d57-4b39-9e9b-25a133f37c1b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.316252 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.316138 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-hostroot\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.316252 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.316162 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-multus-conf-dir\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.316252 
ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.316176 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ft2p\" (UniqueName: \"kubernetes.io/projected/4867ac2c-3d1c-44a3-b5d4-495f207482ed-kube-api-access-2ft2p\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.316252 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.316201 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4867ac2c-3d1c-44a3-b5d4-495f207482ed-cni-binary-copy\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.316967 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.316938 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 18:45:49.316967 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.316962 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 18:45:49.317119 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.317045 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 18:45:49.317119 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.317076 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 18:45:49.317317 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.317301 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2s28b" Apr 22 18:45:49.317455 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.317439 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 18:45:49.317616 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.317598 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 18:45:49.317616 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.317607 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-jtxgh\"" Apr 22 18:45:49.318952 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.318936 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:45:49.319045 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.318989 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal" Apr 22 18:45:49.319812 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.319794 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q" Apr 22 18:45:49.320160 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.320142 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 18:45:49.320325 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.320309 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-fs825\"" Apr 22 18:45:49.320446 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.320432 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 18:45:49.322447 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.322431 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:45:49.322622 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:49.322603 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v6b2n" podUID="e0d458b0-40cd-4eaf-8dbf-220566ae55ef" Apr 22 18:45:49.322716 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.322666 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 18:45:49.322716 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.322675 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 18:45:49.322716 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.322668 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:45:49.322907 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.322668 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 18:45:49.322907 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.322759 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-fbpqc\"" Apr 22 18:45:49.325099 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.325084 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:45:49.325169 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:49.325143 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvpqn" podUID="fb5bea5b-4447-44e8-8573-662eda69835e" Apr 22 18:45:49.327781 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.327766 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-gvq84" Apr 22 18:45:49.330229 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.330211 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 18:45:49.330342 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.330265 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-qgzpj\"" Apr 22 18:45:49.330342 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.330291 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:45:49.330342 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.330312 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:45:49.331055 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.331037 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-133-42.ec2.internal"] Apr 22 18:45:49.331135 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.331122 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.331830 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.331815 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:45:49.331999 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.331986 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal"] Apr 22 18:45:49.333402 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.333383 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:45:49.333517 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.333437 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 18:45:49.333517 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.333454 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-rbl2b\"" Apr 22 18:45:49.342777 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.342745 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:40:48 +0000 UTC" deadline="2027-11-04 09:51:51.737612523 +0000 UTC" Apr 22 18:45:49.342777 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.342774 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13455h6m2.394840468s" Apr 22 18:45:49.349717 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.349695 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-d28g4" Apr 22 18:45:49.357033 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.357014 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-d28g4" Apr 22 18:45:49.411219 ip-10-0-133-42 
kubenswrapper[2571]: I0422 18:45:49.411197 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:45:49.416981 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.416962 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-host-var-lib-cni-multus\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.417076 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.416991 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4qgl\" (UniqueName: \"kubernetes.io/projected/72131875-3a6a-454e-a845-bdca533f20de-kube-api-access-q4qgl\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.417076 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417010 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-996gd\" (UniqueName: \"kubernetes.io/projected/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-kube-api-access-996gd\") pod \"network-metrics-daemon-v6b2n\" (UID: \"e0d458b0-40cd-4eaf-8dbf-220566ae55ef\") " pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:45:49.417076 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417028 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/22922279-9d57-4b39-9e9b-25a133f37c1b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.417076 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417043 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ft2p\" (UniqueName: \"kubernetes.io/projected/4867ac2c-3d1c-44a3-b5d4-495f207482ed-kube-api-access-2ft2p\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.417244 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417077 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-var-lib-openvswitch\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.417244 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417102 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/97c5443c-b607-45ba-8245-88b3b1af7d19-ovnkube-config\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.417244 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417122 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-host-var-lib-cni-multus\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.417244 ip-10-0-133-42 
kubenswrapper[2571]: I0422 18:45:49.417160 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjtq4\" (UniqueName: \"kubernetes.io/projected/64e165b4-deda-4d6f-8f70-c28ac7cebec4-kube-api-access-kjtq4\") pod \"iptables-alerter-gvq84\" (UID: \"64e165b4-deda-4d6f-8f70-c28ac7cebec4\") " pod="openshift-network-operator/iptables-alerter-gvq84" Apr 22 18:45:49.417244 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417187 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-etc-sysconfig\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.417244 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417217 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-sys\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.417497 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417250 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4867ac2c-3d1c-44a3-b5d4-495f207482ed-cni-binary-copy\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.417497 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417277 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-run-openvswitch\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.417497 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417302 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/97c5443c-b607-45ba-8245-88b3b1af7d19-ovn-node-metrics-cert\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.417497 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417327 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/91da0166-33f7-46f2-9824-bf2339c00a28-konnectivity-ca\") pod \"konnectivity-agent-2s28b\" (UID: \"91da0166-33f7-46f2-9824-bf2339c00a28\") " pod="kube-system/konnectivity-agent-2s28b" Apr 22 18:45:49.417497 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417350 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/72131875-3a6a-454e-a845-bdca533f20de-etc-tuned\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.417497 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417393 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs\") pod \"network-metrics-daemon-v6b2n\" (UID: \"e0d458b0-40cd-4eaf-8dbf-220566ae55ef\") " pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:45:49.417497 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417421 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/64e165b4-deda-4d6f-8f70-c28ac7cebec4-iptables-alerter-script\") pod \"iptables-alerter-gvq84\" (UID: \"64e165b4-deda-4d6f-8f70-c28ac7cebec4\") " pod="openshift-network-operator/iptables-alerter-gvq84" Apr 22 18:45:49.417497 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417448 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-etc-modprobe-d\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.417497 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417471 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-etc-sysctl-d\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.417497 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417495 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/36758514-4d64-4551-b091-9b23e243572e-etc-selinux\") pod \"aws-ebs-csi-driver-node-27l6q\" (UID: \"36758514-4d64-4551-b091-9b23e243572e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q" Apr 22 18:45:49.417892 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417524 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0c75569b-ad2e-4296-8dad-807a8c913df1-hosts-file\") pod \"node-resolver-qzzp6\" (UID: \"0c75569b-ad2e-4296-8dad-807a8c913df1\") " pod="openshift-dns/node-resolver-qzzp6" Apr 22 18:45:49.417892 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417591 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6prk4\" (UniqueName: \"kubernetes.io/projected/0c75569b-ad2e-4296-8dad-807a8c913df1-kube-api-access-6prk4\") pod \"node-resolver-qzzp6\" (UID: \"0c75569b-ad2e-4296-8dad-807a8c913df1\") " pod="openshift-dns/node-resolver-qzzp6" Apr 22 18:45:49.417892 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417600 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/22922279-9d57-4b39-9e9b-25a133f37c1b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.417892 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417640 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0c75569b-ad2e-4296-8dad-807a8c913df1-hosts-file\") pod \"node-resolver-qzzp6\" (UID: \"0c75569b-ad2e-4296-8dad-807a8c913df1\") " pod="openshift-dns/node-resolver-qzzp6" Apr 22 
18:45:49.417892 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417689 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.417892 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417708 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4867ac2c-3d1c-44a3-b5d4-495f207482ed-cni-binary-copy\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.417892 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417787 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-lib-modules\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.417892 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417805 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/72131875-3a6a-454e-a845-bdca533f20de-tmp\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.417892 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417848 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/36758514-4d64-4551-b091-9b23e243572e-socket-dir\") pod \"aws-ebs-csi-driver-node-27l6q\" (UID: \"36758514-4d64-4551-b091-9b23e243572e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q" Apr 22 18:45:49.417892 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417877 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/36758514-4d64-4551-b091-9b23e243572e-registration-dir\") pod \"aws-ebs-csi-driver-node-27l6q\" (UID: \"36758514-4d64-4551-b091-9b23e243572e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q" Apr 22 18:45:49.418333 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417919 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqhm7\" (UniqueName: \"kubernetes.io/projected/97c5443c-b607-45ba-8245-88b3b1af7d19-kube-api-access-cqhm7\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.418333 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417962 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/91da0166-33f7-46f2-9824-bf2339c00a28-agent-certs\") pod \"konnectivity-agent-2s28b\" (UID: \"91da0166-33f7-46f2-9824-bf2339c00a28\") " pod="kube-system/konnectivity-agent-2s28b" Apr 22 18:45:49.418333 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.417990 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-os-release\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.418333 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418013 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0c75569b-ad2e-4296-8dad-807a8c913df1-tmp-dir\") pod \"node-resolver-qzzp6\" (UID: \"0c75569b-ad2e-4296-8dad-807a8c913df1\") " pod="openshift-dns/node-resolver-qzzp6" Apr 22 18:45:49.418333 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418074 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e63b9430-ddb4-4626-8019-7bfa90ffac77-host\") pod \"node-ca-rvx2v\" (UID: \"e63b9430-ddb4-4626-8019-7bfa90ffac77\") " pod="openshift-image-registry/node-ca-rvx2v" Apr 22 18:45:49.418333 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418111 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-os-release\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.418333 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418146 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-system-cni-dir\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.418333 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418173 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e63b9430-ddb4-4626-8019-7bfa90ffac77-host\") pod \"node-ca-rvx2v\" (UID: \"e63b9430-ddb4-4626-8019-7bfa90ffac77\") " pod="openshift-image-registry/node-ca-rvx2v" Apr 22 18:45:49.418333 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418231 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-system-cni-dir\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.418333 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418237 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-run-systemd\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.418333 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418261 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0c75569b-ad2e-4296-8dad-807a8c913df1-tmp-dir\") pod \"node-resolver-qzzp6\" (UID: \"0c75569b-ad2e-4296-8dad-807a8c913df1\") " pod="openshift-dns/node-resolver-qzzp6" Apr 22 18:45:49.418333 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418271 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e63b9430-ddb4-4626-8019-7bfa90ffac77-serviceca\") pod \"node-ca-rvx2v\" (UID: 
\"e63b9430-ddb4-4626-8019-7bfa90ffac77\") " pod="openshift-image-registry/node-ca-rvx2v" Apr 22 18:45:49.418333 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418301 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/22922279-9d57-4b39-9e9b-25a133f37c1b-os-release\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.418333 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418326 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-cni-netd\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.418904 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418355 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64e165b4-deda-4d6f-8f70-c28ac7cebec4-host-slash\") pod \"iptables-alerter-gvq84\" (UID: \"64e165b4-deda-4d6f-8f70-c28ac7cebec4\") " pod="openshift-network-operator/iptables-alerter-gvq84" Apr 22 18:45:49.418904 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418357 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/22922279-9d57-4b39-9e9b-25a133f37c1b-os-release\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.418904 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418382 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-etc-sysctl-conf\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.418904 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418416 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-multus-cni-dir\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.418904 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418431 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-etc-kubernetes\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.418904 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418446 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-run-netns\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.418904 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418461 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-run-ovn\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.418904 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418475 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/36758514-4d64-4551-b091-9b23e243572e-sys-fs\") pod \"aws-ebs-csi-driver-node-27l6q\" (UID: \"36758514-4d64-4551-b091-9b23e243572e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q" Apr 22 18:45:49.418904 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418516 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-etc-kubernetes\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.418904 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418564 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-host-run-netns\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.418904 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418590 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-multus-cni-dir\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.418904 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418594 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-slash\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.418904 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418618 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-host-run-netns\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.418904 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418645 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/36758514-4d64-4551-b091-9b23e243572e-device-dir\") pod \"aws-ebs-csi-driver-node-27l6q\" (UID: \"36758514-4d64-4551-b091-9b23e243572e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q" Apr 22 18:45:49.418904 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418654 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e63b9430-ddb4-4626-8019-7bfa90ffac77-serviceca\") pod \"node-ca-rvx2v\" (UID: \"e63b9430-ddb4-4626-8019-7bfa90ffac77\") " pod="openshift-image-registry/node-ca-rvx2v" Apr 22 18:45:49.418904 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418672 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-host-var-lib-kubelet\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.418904 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418695 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-host-var-lib-kubelet\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.419634 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418710 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22922279-9d57-4b39-9e9b-25a133f37c1b-system-cni-dir\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.419634 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418749 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-hostroot\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.419634 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418778 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-multus-conf-dir\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.419634 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418782 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-hostroot\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.419634 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418751 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22922279-9d57-4b39-9e9b-25a133f37c1b-system-cni-dir\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.419634 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418805 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-systemd-units\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.419634 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418830 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-multus-conf-dir\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.419634 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418830 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-run\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.419634 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418860 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-node-log\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.419634 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418893 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-var-lib-kubelet\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.419634 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418925 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2tqk\" (UniqueName: \"kubernetes.io/projected/e63b9430-ddb4-4626-8019-7bfa90ffac77-kube-api-access-c2tqk\") pod \"node-ca-rvx2v\" (UID: \"e63b9430-ddb4-4626-8019-7bfa90ffac77\") " pod="openshift-image-registry/node-ca-rvx2v" Apr 22 18:45:49.419634 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418952 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/22922279-9d57-4b39-9e9b-25a133f37c1b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.419634 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.418980 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8sftj\" (UniqueName: \"kubernetes.io/projected/22922279-9d57-4b39-9e9b-25a133f37c1b-kube-api-access-8sftj\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.419634 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419023 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-multus-socket-dir-parent\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.419634 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419058 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-log-socket\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.419634 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419071 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/22922279-9d57-4b39-9e9b-25a133f37c1b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " 
pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.419634 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419095 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-multus-socket-dir-parent\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.420393 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419164 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-etc-kubernetes\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.420393 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419198 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36758514-4d64-4551-b091-9b23e243572e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-27l6q\" (UID: \"36758514-4d64-4551-b091-9b23e243572e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q" Apr 22 18:45:49.420393 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419224 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/22922279-9d57-4b39-9e9b-25a133f37c1b-cnibin\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.420393 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419249 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-host-run-k8s-cni-cncf-io\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.420393 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419270 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/22922279-9d57-4b39-9e9b-25a133f37c1b-cnibin\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.420393 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419282 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-host-var-lib-cni-bin\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.420393 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419317 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-host-run-k8s-cni-cncf-io\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.420393 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419319 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-etc-openvswitch\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.420393 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419348 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-host-var-lib-cni-bin\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.420393 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419350 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-run-ovn-kubernetes\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.420393 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419381 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/97c5443c-b607-45ba-8245-88b3b1af7d19-ovnkube-script-lib\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.420393 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419401 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/22922279-9d57-4b39-9e9b-25a133f37c1b-cni-binary-copy\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.420393 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419424 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-cnibin\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.420393 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419599 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-cnibin\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.420393 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419654 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-host-run-multus-certs\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.420393 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419687 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-etc-systemd\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.420393 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419714 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czrlh\" (UniqueName: \"kubernetes.io/projected/36758514-4d64-4551-b091-9b23e243572e-kube-api-access-czrlh\") pod \"aws-ebs-csi-driver-node-27l6q\" (UID: \"36758514-4d64-4551-b091-9b23e243572e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q" Apr 22 18:45:49.421131 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419753 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4867ac2c-3d1c-44a3-b5d4-495f207482ed-host-run-multus-certs\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.421131 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419754 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4867ac2c-3d1c-44a3-b5d4-495f207482ed-multus-daemon-config\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.421131 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419812 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/97c5443c-b607-45ba-8245-88b3b1af7d19-env-overrides\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.421131 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419905 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/22922279-9d57-4b39-9e9b-25a133f37c1b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.421131 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.419926 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-kubelet\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.421131 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.420160 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-cni-bin\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.421131 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.420527 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/22922279-9d57-4b39-9e9b-25a133f37c1b-cni-binary-copy\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.421131 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.420657 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-host\") pod 
\"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.421131 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.420697 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljlj7\" (UniqueName: \"kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7\") pod \"network-check-target-bvpqn\" (UID: \"fb5bea5b-4447-44e8-8573-662eda69835e\") " pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:45:49.421131 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.420729 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/22922279-9d57-4b39-9e9b-25a133f37c1b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.421131 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.420859 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4867ac2c-3d1c-44a3-b5d4-495f207482ed-multus-daemon-config\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.424582 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.424280 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:45:49.427098 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.427079 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sftj\" (UniqueName: \"kubernetes.io/projected/22922279-9d57-4b39-9e9b-25a133f37c1b-kube-api-access-8sftj\") pod \"multus-additional-cni-plugins-brbjp\" (UID: \"22922279-9d57-4b39-9e9b-25a133f37c1b\") " pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.427188 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.427137 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ft2p\" (UniqueName: \"kubernetes.io/projected/4867ac2c-3d1c-44a3-b5d4-495f207482ed-kube-api-access-2ft2p\") pod \"multus-hsb2f\" (UID: \"4867ac2c-3d1c-44a3-b5d4-495f207482ed\") " pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.427188 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.427143 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6prk4\" (UniqueName: \"kubernetes.io/projected/0c75569b-ad2e-4296-8dad-807a8c913df1-kube-api-access-6prk4\") pod \"node-resolver-qzzp6\" (UID: \"0c75569b-ad2e-4296-8dad-807a8c913df1\") " pod="openshift-dns/node-resolver-qzzp6" Apr 22 18:45:49.427188 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.427080 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2tqk\" (UniqueName: \"kubernetes.io/projected/e63b9430-ddb4-4626-8019-7bfa90ffac77-kube-api-access-c2tqk\") pod \"node-ca-rvx2v\" (UID: \"e63b9430-ddb4-4626-8019-7bfa90ffac77\") " pod="openshift-image-registry/node-ca-rvx2v" Apr 22 18:45:49.470233 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:49.470193 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd02647afefe84c2445ae8b27f0b7998e.slice/crio-96f759c5ecfd4eacd6e7fe9cca0b4df0aaac9ca680773c47dab5523aa90f233e WatchSource:0}: Error finding container 96f759c5ecfd4eacd6e7fe9cca0b4df0aaac9ca680773c47dab5523aa90f233e: Status 404 returned error can't find the container with id 96f759c5ecfd4eacd6e7fe9cca0b4df0aaac9ca680773c47dab5523aa90f233e Apr 22 18:45:49.472498 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:49.472474 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod380fabb8e9eee8f3530eb1504d622a92.slice/crio-11f6a3eee17d6fdb15f70229cbd9b3574b98558372c61959fb99c14975997aef WatchSource:0}: Error finding container 11f6a3eee17d6fdb15f70229cbd9b3574b98558372c61959fb99c14975997aef: Status 404 returned error can't find the container with id 11f6a3eee17d6fdb15f70229cbd9b3574b98558372c61959fb99c14975997aef Apr 22 18:45:49.474870 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.474850 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:45:49.521944 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.521916 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-systemd-units\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.521944 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.521942 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-run\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.522156 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.521959 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-node-log\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.522156 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.521974 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-var-lib-kubelet\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.522156 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.521997 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-log-socket\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.522156 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522018 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-etc-kubernetes\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.522156 ip-10-0-133-42 
kubenswrapper[2571]: I0422 18:45:49.522030 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-systemd-units\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.522156 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522051 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-var-lib-kubelet\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.522156 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522059 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-run\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.522156 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522071 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-log-socket\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.522156 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522079 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-node-log\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.522156 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522067 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-etc-kubernetes\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.522156 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522089 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36758514-4d64-4551-b091-9b23e243572e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-27l6q\" (UID: \"36758514-4d64-4551-b091-9b23e243572e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q" Apr 22 18:45:49.522156 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522109 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-etc-openvswitch\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.522156 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522125 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-run-ovn-kubernetes\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.522710 
ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522165 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-etc-openvswitch\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.522710 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522165 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-run-ovn-kubernetes\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.522710 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522184 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/97c5443c-b607-45ba-8245-88b3b1af7d19-ovnkube-script-lib\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.522710 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522208 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-etc-systemd\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.522710 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522224 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czrlh\" (UniqueName: \"kubernetes.io/projected/36758514-4d64-4551-b091-9b23e243572e-kube-api-access-czrlh\") pod \"aws-ebs-csi-driver-node-27l6q\" (UID: \"36758514-4d64-4551-b091-9b23e243572e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q" Apr 22 18:45:49.522710 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522162 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36758514-4d64-4551-b091-9b23e243572e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-27l6q\" (UID: \"36758514-4d64-4551-b091-9b23e243572e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q" Apr 22 18:45:49.522710 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522271 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/97c5443c-b607-45ba-8245-88b3b1af7d19-env-overrides\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.522710 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522290 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-kubelet\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.522710 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522305 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-cni-bin\") pod \"ovnkube-node-vmbf4\" (UID: 
\"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.522710 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522304 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-etc-systemd\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.522710 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522325 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-host\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.522710 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522349 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljlj7\" (UniqueName: \"kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7\") pod \"network-check-target-bvpqn\" (UID: \"fb5bea5b-4447-44e8-8573-662eda69835e\") " pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:45:49.522710 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522364 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-kubelet\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.522710 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522375 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4qgl\" (UniqueName: \"kubernetes.io/projected/72131875-3a6a-454e-a845-bdca533f20de-kube-api-access-q4qgl\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.522710 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522392 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-host\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.522710 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522392 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-cni-bin\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.522710 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522410 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-996gd\" (UniqueName: \"kubernetes.io/projected/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-kube-api-access-996gd\") pod \"network-metrics-daemon-v6b2n\" (UID: \"e0d458b0-40cd-4eaf-8dbf-220566ae55ef\") " pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:45:49.523453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522457 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-var-lib-openvswitch\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.523453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522481 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/97c5443c-b607-45ba-8245-88b3b1af7d19-ovnkube-config\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.523453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522502 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjtq4\" (UniqueName: \"kubernetes.io/projected/64e165b4-deda-4d6f-8f70-c28ac7cebec4-kube-api-access-kjtq4\") pod \"iptables-alerter-gvq84\" (UID: \"64e165b4-deda-4d6f-8f70-c28ac7cebec4\") " pod="openshift-network-operator/iptables-alerter-gvq84" Apr 22 18:45:49.523453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522527 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-etc-sysconfig\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.523453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522569 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-sys\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.523453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522580 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-var-lib-openvswitch\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.523453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522593 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-etc-sysconfig\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.523453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522636 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-sys\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.523453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522645 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-run-openvswitch\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.523453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522596 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-run-openvswitch\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.523453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522683 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/97c5443c-b607-45ba-8245-88b3b1af7d19-ovn-node-metrics-cert\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.523453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522708 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/91da0166-33f7-46f2-9824-bf2339c00a28-konnectivity-ca\") pod \"konnectivity-agent-2s28b\" (UID: \"91da0166-33f7-46f2-9824-bf2339c00a28\") " pod="kube-system/konnectivity-agent-2s28b" Apr 22 18:45:49.523453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522732 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/72131875-3a6a-454e-a845-bdca533f20de-etc-tuned\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.523453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522755 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs\") pod \"network-metrics-daemon-v6b2n\" (UID: \"e0d458b0-40cd-4eaf-8dbf-220566ae55ef\") " pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:45:49.523453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522780 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/64e165b4-deda-4d6f-8f70-c28ac7cebec4-iptables-alerter-script\") pod \"iptables-alerter-gvq84\" (UID: \"64e165b4-deda-4d6f-8f70-c28ac7cebec4\") " pod="openshift-network-operator/iptables-alerter-gvq84" Apr 22 18:45:49.523453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522800 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/97c5443c-b607-45ba-8245-88b3b1af7d19-ovnkube-script-lib\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.523453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522803 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-etc-modprobe-d\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.524202 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522854 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-etc-sysctl-d\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.524202 ip-10-0-133-42 kubenswrapper[2571]: 
I0422 18:45:49.522871 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/36758514-4d64-4551-b091-9b23e243572e-etc-selinux\") pod \"aws-ebs-csi-driver-node-27l6q\" (UID: \"36758514-4d64-4551-b091-9b23e243572e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q" Apr 22 18:45:49.524202 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:49.522885 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:45:49.524202 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522892 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.524202 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522903 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-etc-modprobe-d\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.524202 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522920 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-lib-modules\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.524202 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522941 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/72131875-3a6a-454e-a845-bdca533f20de-tmp\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.524202 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.522955 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/36758514-4d64-4551-b091-9b23e243572e-etc-selinux\") pod \"aws-ebs-csi-driver-node-27l6q\" (UID: \"36758514-4d64-4551-b091-9b23e243572e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q" Apr 22 18:45:49.524202 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:49.522976 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs podName:e0d458b0-40cd-4eaf-8dbf-220566ae55ef nodeName:}" failed. No retries permitted until 2026-04-22 18:45:50.02294496 +0000 UTC m=+2.121294879 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs") pod "network-metrics-daemon-v6b2n" (UID: "e0d458b0-40cd-4eaf-8dbf-220566ae55ef") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:45:49.524202 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523016 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/36758514-4d64-4551-b091-9b23e243572e-socket-dir\") pod \"aws-ebs-csi-driver-node-27l6q\" (UID: \"36758514-4d64-4551-b091-9b23e243572e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q"
Apr 22 18:45:49.524202 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523041 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-etc-sysctl-d\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq"
Apr 22 18:45:49.524202 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523044 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/36758514-4d64-4551-b091-9b23e243572e-registration-dir\") pod \"aws-ebs-csi-driver-node-27l6q\" (UID: \"36758514-4d64-4551-b091-9b23e243572e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q"
Apr 22 18:45:49.524202 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523082 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/36758514-4d64-4551-b091-9b23e243572e-registration-dir\") pod \"aws-ebs-csi-driver-node-27l6q\" (UID: \"36758514-4d64-4551-b091-9b23e243572e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q"
Apr 22 18:45:49.524202 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523107 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqhm7\" (UniqueName: \"kubernetes.io/projected/97c5443c-b607-45ba-8245-88b3b1af7d19-kube-api-access-cqhm7\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4"
Apr 22 18:45:49.524202 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523148 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/91da0166-33f7-46f2-9824-bf2339c00a28-agent-certs\") pod \"konnectivity-agent-2s28b\" (UID: \"91da0166-33f7-46f2-9824-bf2339c00a28\") " pod="kube-system/konnectivity-agent-2s28b"
Apr 22 18:45:49.524202 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523180 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-run-systemd\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4"
Apr 22 18:45:49.524202 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523213 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-cni-netd\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4"
Apr 22 18:45:49.524732 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523239 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64e165b4-deda-4d6f-8f70-c28ac7cebec4-host-slash\") pod \"iptables-alerter-gvq84\" (UID: \"64e165b4-deda-4d6f-8f70-c28ac7cebec4\") " pod="openshift-network-operator/iptables-alerter-gvq84"
Apr 22 18:45:49.524732 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523262 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-etc-sysctl-conf\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq"
Apr 22 18:45:49.524732 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523287 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-run-netns\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4"
Apr 22 18:45:49.524732 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523311 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-run-ovn\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4"
Apr 22 18:45:49.524732 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523327 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/36758514-4d64-4551-b091-9b23e243572e-socket-dir\") pod \"aws-ebs-csi-driver-node-27l6q\" (UID: \"36758514-4d64-4551-b091-9b23e243572e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q"
Apr 22 18:45:49.524732 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523335 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-cni-netd\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4"
Apr 22 18:45:49.524732 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523336 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/36758514-4d64-4551-b091-9b23e243572e-sys-fs\") pod \"aws-ebs-csi-driver-node-27l6q\" (UID: \"36758514-4d64-4551-b091-9b23e243572e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q"
Apr 22 18:45:49.524732 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523368 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-slash\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4"
Apr 22 18:45:49.524732 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523372 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4"
Apr 22 18:45:49.524732 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523401 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-run-systemd\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4"
Apr 22 18:45:49.524732 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523325 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/97c5443c-b607-45ba-8245-88b3b1af7d19-env-overrides\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4"
Apr 22 18:45:49.524732 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523420 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/36758514-4d64-4551-b091-9b23e243572e-device-dir\") pod \"aws-ebs-csi-driver-node-27l6q\" (UID: \"36758514-4d64-4551-b091-9b23e243572e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q"
Apr 22 18:45:49.524732 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523285 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/91da0166-33f7-46f2-9824-bf2339c00a28-konnectivity-ca\") pod \"konnectivity-agent-2s28b\" (UID: \"91da0166-33f7-46f2-9824-bf2339c00a28\") " pod="kube-system/konnectivity-agent-2s28b"
Apr 22 18:45:49.524732 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523439 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64e165b4-deda-4d6f-8f70-c28ac7cebec4-host-slash\") pod \"iptables-alerter-gvq84\" (UID: \"64e165b4-deda-4d6f-8f70-c28ac7cebec4\") " pod="openshift-network-operator/iptables-alerter-gvq84"
Apr 22 18:45:49.524732 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523491 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-run-ovn\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4"
Apr 22 18:45:49.524732 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523499 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/36758514-4d64-4551-b091-9b23e243572e-sys-fs\") pod \"aws-ebs-csi-driver-node-27l6q\" (UID: \"36758514-4d64-4551-b091-9b23e243572e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q"
Apr 22 18:45:49.524732 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523565 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-lib-modules\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq"
Apr 22 18:45:49.525178 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523565 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/36758514-4d64-4551-b091-9b23e243572e-device-dir\") pod \"aws-ebs-csi-driver-node-27l6q\" (UID: \"36758514-4d64-4551-b091-9b23e243572e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q"
Apr 22 18:45:49.525178 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523606 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-run-netns\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4"
Apr 22 18:45:49.525178 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523640 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/72131875-3a6a-454e-a845-bdca533f20de-etc-sysctl-conf\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq"
Apr 22 18:45:49.525178 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.523643 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/97c5443c-b607-45ba-8245-88b3b1af7d19-host-slash\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4"
Apr 22 18:45:49.525178 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.524067 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/64e165b4-deda-4d6f-8f70-c28ac7cebec4-iptables-alerter-script\") pod \"iptables-alerter-gvq84\" (UID: \"64e165b4-deda-4d6f-8f70-c28ac7cebec4\") " pod="openshift-network-operator/iptables-alerter-gvq84"
Apr 22 18:45:49.525178 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.524119 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/97c5443c-b607-45ba-8245-88b3b1af7d19-ovnkube-config\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4"
Apr 22 18:45:49.525376 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.525360 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/72131875-3a6a-454e-a845-bdca533f20de-etc-tuned\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq"
Apr 22 18:45:49.525493 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.525468 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/97c5443c-b607-45ba-8245-88b3b1af7d19-ovn-node-metrics-cert\") pod \"ovnkube-node-vmbf4\" (UID: \"97c5443c-b607-45ba-8245-88b3b1af7d19\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4"
Apr 22 18:45:49.525719 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.525701 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/72131875-3a6a-454e-a845-bdca533f20de-tmp\") pod \"tuned-4wgsq\" (UID: \"72131875-3a6a-454e-a845-bdca533f20de\") " pod="openshift-cluster-node-tuning-operator/tuned-4wgsq"
Apr 22 18:45:49.525807 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.525791 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/91da0166-33f7-46f2-9824-bf2339c00a28-agent-certs\") pod \"konnectivity-agent-2s28b\" (UID: \"91da0166-33f7-46f2-9824-bf2339c00a28\") " pod="kube-system/konnectivity-agent-2s28b"
Apr 22 18:45:49.530856 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:49.530837 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:45:49.530950 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:49.530859 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:45:49.530950 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:49.530871 2571 projected.go:194] Error preparing data for projected volume kube-api-access-ljlj7 for pod openshift-network-diagnostics/network-check-target-bvpqn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:45:49.530950 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:49.530945 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7 podName:fb5bea5b-4447-44e8-8573-662eda69835e nodeName:}" failed. No retries permitted until 2026-04-22 18:45:50.03092495 +0000 UTC m=+2.129274885 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ljlj7" (UniqueName: "kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7") pod "network-check-target-bvpqn" (UID: "fb5bea5b-4447-44e8-8573-662eda69835e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q" Apr 22 18:45:49.533065 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.533052 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjtq4\" (UniqueName: \"kubernetes.io/projected/64e165b4-deda-4d6f-8f70-c28ac7cebec4-kube-api-access-kjtq4\") pod \"iptables-alerter-gvq84\" (UID: \"64e165b4-deda-4d6f-8f70-c28ac7cebec4\") " pod="openshift-network-operator/iptables-alerter-gvq84" Apr 22 18:45:49.565347 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.565320 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:45:49.630869 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.630798 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qzzp6" Apr 22 18:45:49.636574 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:49.636550 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c75569b_ad2e_4296_8dad_807a8c913df1.slice/crio-9bcf2d20048c257af2fea7e17e7abbc78fc4ebff856d4ea30093dcea591cf777 WatchSource:0}: Error finding container 9bcf2d20048c257af2fea7e17e7abbc78fc4ebff856d4ea30093dcea591cf777: Status 404 returned error can't find the container with id 9bcf2d20048c257af2fea7e17e7abbc78fc4ebff856d4ea30093dcea591cf777 Apr 22 18:45:49.640508 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.640487 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rvx2v" Apr 22 18:45:49.646437 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:49.646414 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode63b9430_ddb4_4626_8019_7bfa90ffac77.slice/crio-3aa3f11627def5f1b1895521aca75318366656d9bb1b2007bc4fa311453b5e85 WatchSource:0}: Error finding container 3aa3f11627def5f1b1895521aca75318366656d9bb1b2007bc4fa311453b5e85: Status 404 returned error can't find the container with id 3aa3f11627def5f1b1895521aca75318366656d9bb1b2007bc4fa311453b5e85 Apr 22 18:45:49.660096 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.660071 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-brbjp" Apr 22 18:45:49.665739 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:49.665714 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22922279_9d57_4b39_9e9b_25a133f37c1b.slice/crio-e90da11d8589f8f4a419054f9e59e96ea7a919d180eef973735d146bab528769 WatchSource:0}: Error finding container e90da11d8589f8f4a419054f9e59e96ea7a919d180eef973735d146bab528769: Status 404 returned error can't find the container with id e90da11d8589f8f4a419054f9e59e96ea7a919d180eef973735d146bab528769 Apr 22 18:45:49.675826 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.675809 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-hsb2f" Apr 22 18:45:49.682097 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:49.682078 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4867ac2c_3d1c_44a3_b5d4_495f207482ed.slice/crio-2717a68a5a5e74e8496420ebc22175aa3b8125d47f223f0fbb69f015228853b7 WatchSource:0}: Error finding container 2717a68a5a5e74e8496420ebc22175aa3b8125d47f223f0fbb69f015228853b7: Status 404 returned error can't find the container with id 2717a68a5a5e74e8496420ebc22175aa3b8125d47f223f0fbb69f015228853b7 Apr 22 18:45:49.691092 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.691077 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:45:49.696277 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:49.696254 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97c5443c_b607_45ba_8245_88b3b1af7d19.slice/crio-cc7136ae78a408b2282ef5d5a912eac344e8cb5f15e5876183c7ebe04b6b0d39 WatchSource:0}: Error finding container cc7136ae78a408b2282ef5d5a912eac344e8cb5f15e5876183c7ebe04b6b0d39: Status 404 returned error can't find the container with id cc7136ae78a408b2282ef5d5a912eac344e8cb5f15e5876183c7ebe04b6b0d39 Apr 22 18:45:49.704264 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.704246 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2s28b" Apr 22 18:45:49.710692 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:49.710671 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91da0166_33f7_46f2_9824_bf2339c00a28.slice/crio-c124153999ec55d00c4dad6349032d4c2d375173012a7c2ac79c7bcb4eb38c63 WatchSource:0}: Error finding container c124153999ec55d00c4dad6349032d4c2d375173012a7c2ac79c7bcb4eb38c63: Status 404 returned error can't find the container with id c124153999ec55d00c4dad6349032d4c2d375173012a7c2ac79c7bcb4eb38c63 Apr 22 18:45:49.717600 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.717585 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q" Apr 22 18:45:49.723298 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:49.723278 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36758514_4d64_4551_b091_9b23e243572e.slice/crio-8f630f16aa9a8c8c90abced6415ebad2e4b9a75c2187ca92113d98f4eac3e580 WatchSource:0}: Error finding container 8f630f16aa9a8c8c90abced6415ebad2e4b9a75c2187ca92113d98f4eac3e580: Status 404 returned error can't find the container with id 8f630f16aa9a8c8c90abced6415ebad2e4b9a75c2187ca92113d98f4eac3e580 Apr 22 18:45:49.737416 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.737397 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gvq84" Apr 22 18:45:49.742731 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.742703 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" Apr 22 18:45:49.742930 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:49.742909 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64e165b4_deda_4d6f_8f70_c28ac7cebec4.slice/crio-559bbc11fa54725bc778a2a1d39bcf542aefe597af0a5a0cfe030cc2dbcea9ec WatchSource:0}: Error finding container 559bbc11fa54725bc778a2a1d39bcf542aefe597af0a5a0cfe030cc2dbcea9ec: Status 404 returned error can't find the container with id 559bbc11fa54725bc778a2a1d39bcf542aefe597af0a5a0cfe030cc2dbcea9ec Apr 22 18:45:49.749273 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:45:49.749250 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72131875_3a6a_454e_a845_bdca533f20de.slice/crio-3082d93c4c5e06ebb761b16cd42bb2488655a570ef4ff5dbe229262b8482cfc8 WatchSource:0}: Error finding container 3082d93c4c5e06ebb761b16cd42bb2488655a570ef4ff5dbe229262b8482cfc8: Status 404 returned error can't find the container with id 3082d93c4c5e06ebb761b16cd42bb2488655a570ef4ff5dbe229262b8482cfc8 Apr 22 18:45:49.883509 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:49.883400 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:45:50.026620 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:50.026590 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs\") pod \"network-metrics-daemon-v6b2n\" (UID: \"e0d458b0-40cd-4eaf-8dbf-220566ae55ef\") " pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:45:50.026796 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:50.026754 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:45:50.026856 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:50.026843 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs podName:e0d458b0-40cd-4eaf-8dbf-220566ae55ef nodeName:}" failed. No retries permitted until 2026-04-22 18:45:51.026822266 +0000 UTC m=+3.125172205 (durationBeforeRetry 1s). 
Apr 22 18:45:50.026620 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:50.026590 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs\") pod \"network-metrics-daemon-v6b2n\" (UID: \"e0d458b0-40cd-4eaf-8dbf-220566ae55ef\") " pod="openshift-multus/network-metrics-daemon-v6b2n"
Apr 22 18:45:50.026796 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:50.026754 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:45:50.026856 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:50.026843 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs podName:e0d458b0-40cd-4eaf-8dbf-220566ae55ef nodeName:}" failed. No retries permitted until 2026-04-22 18:45:51.026822266 +0000 UTC m=+3.125172205 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs") pod "network-metrics-daemon-v6b2n" (UID: "e0d458b0-40cd-4eaf-8dbf-220566ae55ef") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:45:50.127375 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:50.127339 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljlj7\" (UniqueName: \"kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7\") pod \"network-check-target-bvpqn\" (UID: \"fb5bea5b-4447-44e8-8573-662eda69835e\") " pod="openshift-network-diagnostics/network-check-target-bvpqn"
Apr 22 18:45:50.127553 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:50.127514 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:45:50.127629 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:50.127582 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:45:50.127629 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:50.127599 2571 projected.go:194] Error preparing data for projected volume kube-api-access-ljlj7 for pod openshift-network-diagnostics/network-check-target-bvpqn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:45:50.127739 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:50.127668 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7 podName:fb5bea5b-4447-44e8-8573-662eda69835e nodeName:}" failed. No retries permitted until 2026-04-22 18:45:51.127639236 +0000 UTC m=+3.225989154 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ljlj7" (UniqueName: "kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7") pod "network-check-target-bvpqn" (UID: "fb5bea5b-4447-44e8-8573-662eda69835e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:45:50.165780 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:50.165709 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:45:50.358607 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:50.358562 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:40:49 +0000 UTC" deadline="2027-09-25 06:36:54.166841037 +0000 UTC"
Apr 22 18:45:50.358607 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:50.358604 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12491h51m3.808240695s"
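The two certificate_manager records above, and the pair a second later, compute different deadlines (2027-09-25, then 2027-11-14) for the same 2028-04-21 expiration: the rotation point is jittered so a whole fleet does not renew at once. A Go sketch of that idea; the 70-90%-of-lifetime window mirrors the upstream client-go certificate manager, but treat the exact constants and the notBefore value as assumptions.

    package main

    import (
        "fmt"
        "math/rand"
        "time"
    )

    // rotationDeadline picks a random point in the window covering
    // 70% to 90% of the certificate's total lifetime.
    func rotationDeadline(notBefore, notAfter time.Time) time.Time {
        total := notAfter.Sub(notBefore)
        jittered := time.Duration(float64(total) * (0.7 + 0.2*rand.Float64()))
        return notBefore.Add(jittered)
    }

    func main() {
        // notBefore is an assumption: a three-year cert ending at the
        // expiration logged above.
        notBefore := time.Date(2025, 4, 22, 18, 40, 49, 0, time.UTC)
        notAfter := time.Date(2028, 4, 21, 18, 40, 49, 0, time.UTC)
        // Two successive computations land on different deadlines,
        // just as the two pairs of log records do.
        fmt.Println(rotationDeadline(notBefore, notAfter))
        fmt.Println(rotationDeadline(notBefore, notAfter))
    }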
Apr 22 18:45:50.463219 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:50.463104 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2s28b" event={"ID":"91da0166-33f7-46f2-9824-bf2339c00a28","Type":"ContainerStarted","Data":"c124153999ec55d00c4dad6349032d4c2d375173012a7c2ac79c7bcb4eb38c63"}
Apr 22 18:45:50.492928 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:50.492844 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" event={"ID":"97c5443c-b607-45ba-8245-88b3b1af7d19","Type":"ContainerStarted","Data":"cc7136ae78a408b2282ef5d5a912eac344e8cb5f15e5876183c7ebe04b6b0d39"}
Apr 22 18:45:50.520468 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:50.520430 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qzzp6" event={"ID":"0c75569b-ad2e-4296-8dad-807a8c913df1","Type":"ContainerStarted","Data":"9bcf2d20048c257af2fea7e17e7abbc78fc4ebff856d4ea30093dcea591cf777"}
Apr 22 18:45:50.534696 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:50.534402 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" event={"ID":"72131875-3a6a-454e-a845-bdca533f20de","Type":"ContainerStarted","Data":"3082d93c4c5e06ebb761b16cd42bb2488655a570ef4ff5dbe229262b8482cfc8"}
Apr 22 18:45:50.550019 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:50.549942 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gvq84" event={"ID":"64e165b4-deda-4d6f-8f70-c28ac7cebec4","Type":"ContainerStarted","Data":"559bbc11fa54725bc778a2a1d39bcf542aefe597af0a5a0cfe030cc2dbcea9ec"}
Apr 22 18:45:50.563960 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:50.563890 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q" event={"ID":"36758514-4d64-4551-b091-9b23e243572e","Type":"ContainerStarted","Data":"8f630f16aa9a8c8c90abced6415ebad2e4b9a75c2187ca92113d98f4eac3e580"}
Apr 22 18:45:50.571356 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:50.571294 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hsb2f" event={"ID":"4867ac2c-3d1c-44a3-b5d4-495f207482ed","Type":"ContainerStarted","Data":"2717a68a5a5e74e8496420ebc22175aa3b8125d47f223f0fbb69f015228853b7"}
Apr 22 18:45:50.580835 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:50.580751 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brbjp" event={"ID":"22922279-9d57-4b39-9e9b-25a133f37c1b","Type":"ContainerStarted","Data":"e90da11d8589f8f4a419054f9e59e96ea7a919d180eef973735d146bab528769"}
Apr 22 18:45:50.586282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:50.586209 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rvx2v" event={"ID":"e63b9430-ddb4-4626-8019-7bfa90ffac77","Type":"ContainerStarted","Data":"3aa3f11627def5f1b1895521aca75318366656d9bb1b2007bc4fa311453b5e85"}
Apr 22 18:45:50.587904 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:50.587837 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-42.ec2.internal" event={"ID":"380fabb8e9eee8f3530eb1504d622a92","Type":"ContainerStarted","Data":"11f6a3eee17d6fdb15f70229cbd9b3574b98558372c61959fb99c14975997aef"}
Apr 22 18:45:50.592835 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:50.592774 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal" event={"ID":"d02647afefe84c2445ae8b27f0b7998e","Type":"ContainerStarted","Data":"96f759c5ecfd4eacd6e7fe9cca0b4df0aaac9ca680773c47dab5523aa90f233e"}
Apr 22 18:45:51.038242 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:51.037626 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs\") pod \"network-metrics-daemon-v6b2n\" (UID: \"e0d458b0-40cd-4eaf-8dbf-220566ae55ef\") " pod="openshift-multus/network-metrics-daemon-v6b2n"
Apr 22 18:45:51.038242 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:51.037782 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:45:51.038242 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:51.037847 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs podName:e0d458b0-40cd-4eaf-8dbf-220566ae55ef nodeName:}" failed. No retries permitted until 2026-04-22 18:45:53.037828168 +0000 UTC m=+5.136178087 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs") pod "network-metrics-daemon-v6b2n" (UID: "e0d458b0-40cd-4eaf-8dbf-220566ae55ef") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:45:51.138821 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:51.138789 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljlj7\" (UniqueName: \"kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7\") pod \"network-check-target-bvpqn\" (UID: \"fb5bea5b-4447-44e8-8573-662eda69835e\") " pod="openshift-network-diagnostics/network-check-target-bvpqn"
Apr 22 18:45:51.139012 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:51.138957 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:45:51.139012 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:51.138978 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:45:51.139012 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:51.138991 2571 projected.go:194] Error preparing data for projected volume kube-api-access-ljlj7 for pod openshift-network-diagnostics/network-check-target-bvpqn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:45:51.139174 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:51.139051 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7 podName:fb5bea5b-4447-44e8-8573-662eda69835e nodeName:}" failed. No retries permitted until 2026-04-22 18:45:53.13903149 +0000 UTC m=+5.237381414 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ljlj7" (UniqueName: "kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7") pod "network-check-target-bvpqn" (UID: "fb5bea5b-4447-44e8-8573-662eda69835e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:45:51.358965 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:51.358864 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:40:49 +0000 UTC" deadline="2027-11-14 01:55:37.408026454 +0000 UTC"
Apr 22 18:45:51.358965 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:51.358906 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13687h9m46.049124795s"
Apr 22 18:45:51.424008 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:51.423503 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvpqn"
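The "SyncLoop (PLEG): event for pod" records above serialize each pod lifecycle event as {"ID","Type","Data"}. A minimal Go mirror of that shape, useful when post-processing a capture like this; the field names follow the logged JSON rather than the kubelet's internal types, so treat them as assumptions.

    package main

    import "fmt"

    // PodLifecycleEvent mirrors the {"ID","Type","Data"} shape that the
    // SyncLoop (PLEG) records log: pod UID, event type, and the container
    // or sandbox ID the event refers to.
    type PodLifecycleEvent struct {
        ID   string // pod UID
        Type string // e.g. "ContainerStarted", "ContainerDied"
        Data string // container or sandbox ID
    }

    func main() {
        ev := PodLifecycleEvent{
            ID:   "97c5443c-b607-45ba-8245-88b3b1af7d19",
            Type: "ContainerStarted",
            Data: "cc7136ae78a408b2282ef5d5a912eac344e8cb5f15e5876183c7ebe04b6b0d39",
        }
        fmt.Printf("SyncLoop (PLEG): event for pod %s: %s %s\n", ev.ID, ev.Type, ev.Data)
    }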
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:45:51.424008 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:51.423649 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvpqn" podUID="fb5bea5b-4447-44e8-8573-662eda69835e" Apr 22 18:45:51.424008 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:51.423845 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v6b2n" podUID="e0d458b0-40cd-4eaf-8dbf-220566ae55ef" Apr 22 18:45:53.056671 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:53.056610 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs\") pod \"network-metrics-daemon-v6b2n\" (UID: \"e0d458b0-40cd-4eaf-8dbf-220566ae55ef\") " pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:45:53.057122 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:53.056796 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:45:53.057122 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:53.056873 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs podName:e0d458b0-40cd-4eaf-8dbf-220566ae55ef nodeName:}" failed. No retries permitted until 2026-04-22 18:45:57.056852668 +0000 UTC m=+9.155202590 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs") pod "network-metrics-daemon-v6b2n" (UID: "e0d458b0-40cd-4eaf-8dbf-220566ae55ef") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:45:53.157091 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:53.156960 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljlj7\" (UniqueName: \"kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7\") pod \"network-check-target-bvpqn\" (UID: \"fb5bea5b-4447-44e8-8573-662eda69835e\") " pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:45:53.157260 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:53.157122 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:45:53.157260 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:53.157145 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:45:53.157260 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:53.157159 2571 projected.go:194] Error preparing data for projected volume kube-api-access-ljlj7 for pod openshift-network-diagnostics/network-check-target-bvpqn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:45:53.157260 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:53.157212 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7 podName:fb5bea5b-4447-44e8-8573-662eda69835e nodeName:}" failed. No retries permitted until 2026-04-22 18:45:57.157197808 +0000 UTC m=+9.255547726 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ljlj7" (UniqueName: "kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7") pod "network-check-target-bvpqn" (UID: "fb5bea5b-4447-44e8-8573-662eda69835e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:45:53.422847 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:53.422762 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:45:53.422999 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:53.422907 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvpqn" podUID="fb5bea5b-4447-44e8-8573-662eda69835e" Apr 22 18:45:53.423092 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:53.423070 2571 util.go:30] "No sandbox for pod can be found. 
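The repeating NetworkPluginNotReady errors all trace to one missing artifact: nothing has yet written a CNI network configuration into /etc/kubernetes/cni/net.d/, which ovnkube-node does once it is up. A stdlib-only Go sketch of the presence check the runtime keeps failing; the suffix list follows libcni's convention and is an assumption, not the runtime's exact logic.

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    // hasCNIConfig reports whether dir contains at least one CNI network
    // configuration file (.conf, .conflist or .json by convention).
    func hasCNIConfig(dir string) (bool, error) {
        entries, err := os.ReadDir(dir)
        if err != nil {
            return false, err
        }
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                return true, nil
            }
        }
        return false, nil
    }

    func main() {
        ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
        fmt.Println(ok, err) // false until the network plugin writes its config
    }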
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:45:53.423205 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:53.423182 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v6b2n" podUID="e0d458b0-40cd-4eaf-8dbf-220566ae55ef" Apr 22 18:45:55.422997 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:55.422736 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:45:55.422997 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:55.422882 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvpqn" podUID="fb5bea5b-4447-44e8-8573-662eda69835e" Apr 22 18:45:55.423465 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:55.423024 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:45:55.423465 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:55.423195 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v6b2n" podUID="e0d458b0-40cd-4eaf-8dbf-220566ae55ef" Apr 22 18:45:57.091937 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:57.091895 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs\") pod \"network-metrics-daemon-v6b2n\" (UID: \"e0d458b0-40cd-4eaf-8dbf-220566ae55ef\") " pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:45:57.092458 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:57.092055 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:45:57.092458 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:57.092125 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs podName:e0d458b0-40cd-4eaf-8dbf-220566ae55ef nodeName:}" failed. No retries permitted until 2026-04-22 18:46:05.092106554 +0000 UTC m=+17.190456474 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs") pod "network-metrics-daemon-v6b2n" (UID: "e0d458b0-40cd-4eaf-8dbf-220566ae55ef") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:45:57.192290 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:57.192247 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljlj7\" (UniqueName: \"kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7\") pod \"network-check-target-bvpqn\" (UID: \"fb5bea5b-4447-44e8-8573-662eda69835e\") " pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:45:57.192475 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:57.192434 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:45:57.192475 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:57.192458 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:45:57.192475 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:57.192472 2571 projected.go:194] Error preparing data for projected volume kube-api-access-ljlj7 for pod openshift-network-diagnostics/network-check-target-bvpqn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:45:57.192677 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:57.192550 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7 podName:fb5bea5b-4447-44e8-8573-662eda69835e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:05.192516243 +0000 UTC m=+17.290866173 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ljlj7" (UniqueName: "kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7") pod "network-check-target-bvpqn" (UID: "fb5bea5b-4447-44e8-8573-662eda69835e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:45:57.423817 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:57.423123 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:45:57.423817 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:57.423244 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvpqn" podUID="fb5bea5b-4447-44e8-8573-662eda69835e" Apr 22 18:45:57.423817 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:57.423645 2571 util.go:30] "No sandbox for pod can be found. 
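The "NetworkReady=false reason:NetworkPluginNotReady" clause in these errors is a runtime condition the kubelet reads back over CRI, so it can also be queried directly. A hedged Go sketch using the cri-api client; the socket path is an assumption (CRI-O's usual location on OpenShift nodes), and crictl info reports the same condition.

    package main

    import (
        "context"
        "fmt"
        "time"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    func main() {
        // Socket path is an assumption; adjust for your runtime.
        conn, err := grpc.Dial("unix:///var/run/crio/crio.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            panic(err)
        }
        defer conn.Close()

        client := runtimeapi.NewRuntimeServiceClient(conn)
        ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
        defer cancel()

        resp, err := client.Status(ctx, &runtimeapi.StatusRequest{})
        if err != nil {
            panic(err)
        }
        for _, c := range resp.Status.Conditions {
            // The kubelet's "network is not ready" errors are driven by the
            // NetworkReady condition reported here.
            fmt.Printf("%s=%v reason=%s message=%s\n", c.Type, c.Status, c.Reason, c.Message)
        }
    }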
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:45:57.423817 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:57.423756 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v6b2n" podUID="e0d458b0-40cd-4eaf-8dbf-220566ae55ef" Apr 22 18:45:59.423273 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:59.423238 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:45:59.423728 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:45:59.423237 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:45:59.423728 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:59.423365 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvpqn" podUID="fb5bea5b-4447-44e8-8573-662eda69835e" Apr 22 18:45:59.423728 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:45:59.423481 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v6b2n" podUID="e0d458b0-40cd-4eaf-8dbf-220566ae55ef" Apr 22 18:46:01.422793 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:01.422755 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:46:01.423243 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:01.422861 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvpqn" podUID="fb5bea5b-4447-44e8-8573-662eda69835e" Apr 22 18:46:01.423243 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:01.422923 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:46:01.423243 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:01.423059 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v6b2n" podUID="e0d458b0-40cd-4eaf-8dbf-220566ae55ef" Apr 22 18:46:03.422704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:03.422664 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:46:03.423124 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:03.422664 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:46:03.423124 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:03.422782 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v6b2n" podUID="e0d458b0-40cd-4eaf-8dbf-220566ae55ef" Apr 22 18:46:03.423124 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:03.422833 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvpqn" podUID="fb5bea5b-4447-44e8-8573-662eda69835e" Apr 22 18:46:05.153925 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:05.153872 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs\") pod \"network-metrics-daemon-v6b2n\" (UID: \"e0d458b0-40cd-4eaf-8dbf-220566ae55ef\") " pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:46:05.154358 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:05.154035 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:05.154358 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:05.154105 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs podName:e0d458b0-40cd-4eaf-8dbf-220566ae55ef nodeName:}" failed. No retries permitted until 2026-04-22 18:46:21.154084786 +0000 UTC m=+33.252434706 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs") pod "network-metrics-daemon-v6b2n" (UID: "e0d458b0-40cd-4eaf-8dbf-220566ae55ef") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:05.254684 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:05.254646 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljlj7\" (UniqueName: \"kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7\") pod \"network-check-target-bvpqn\" (UID: \"fb5bea5b-4447-44e8-8573-662eda69835e\") " pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:46:05.254921 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:05.254847 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:05.254921 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:05.254879 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:05.254921 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:05.254896 2571 projected.go:194] Error preparing data for projected volume kube-api-access-ljlj7 for pod openshift-network-diagnostics/network-check-target-bvpqn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:05.255152 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:05.254962 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7 podName:fb5bea5b-4447-44e8-8573-662eda69835e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:21.254941453 +0000 UTC m=+33.353291378 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ljlj7" (UniqueName: "kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7") pod "network-check-target-bvpqn" (UID: "fb5bea5b-4447-44e8-8573-662eda69835e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:05.423197 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:05.423115 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:46:05.423197 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:05.423132 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-fk9jc"] Apr 22 18:46:05.423197 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:05.423115 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:46:05.423453 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:05.423253 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bvpqn" podUID="fb5bea5b-4447-44e8-8573-662eda69835e" Apr 22 18:46:05.423453 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:05.423411 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v6b2n" podUID="e0d458b0-40cd-4eaf-8dbf-220566ae55ef" Apr 22 18:46:05.458663 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:05.458630 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:05.458816 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:05.458715 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fk9jc" podUID="f793756b-ba29-4f1d-878b-9d4abe4d5ad3" Apr 22 18:46:05.557264 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:05.557234 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-kubelet-config\") pod \"global-pull-secret-syncer-fk9jc\" (UID: \"f793756b-ba29-4f1d-878b-9d4abe4d5ad3\") " pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:05.557429 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:05.557285 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-dbus\") pod \"global-pull-secret-syncer-fk9jc\" (UID: \"f793756b-ba29-4f1d-878b-9d4abe4d5ad3\") " pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:05.557429 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:05.557318 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret\") pod \"global-pull-secret-syncer-fk9jc\" (UID: \"f793756b-ba29-4f1d-878b-9d4abe4d5ad3\") " pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:05.658090 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:05.658058 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-kubelet-config\") pod \"global-pull-secret-syncer-fk9jc\" (UID: \"f793756b-ba29-4f1d-878b-9d4abe4d5ad3\") " pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:05.658279 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:05.658110 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-dbus\") pod \"global-pull-secret-syncer-fk9jc\" (UID: \"f793756b-ba29-4f1d-878b-9d4abe4d5ad3\") " pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:05.658279 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:05.658142 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret\") pod \"global-pull-secret-syncer-fk9jc\" (UID: \"f793756b-ba29-4f1d-878b-9d4abe4d5ad3\") " pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:05.658279 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:05.658200 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-kubelet-config\") pod \"global-pull-secret-syncer-fk9jc\" (UID: \"f793756b-ba29-4f1d-878b-9d4abe4d5ad3\") " pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:05.658279 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:05.658260 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:05.658492 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:05.658324 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret podName:f793756b-ba29-4f1d-878b-9d4abe4d5ad3 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:06.15830681 +0000 UTC m=+18.256656743 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret") pod "global-pull-secret-syncer-fk9jc" (UID: "f793756b-ba29-4f1d-878b-9d4abe4d5ad3") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:05.658492 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:05.658349 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-dbus\") pod \"global-pull-secret-syncer-fk9jc\" (UID: \"f793756b-ba29-4f1d-878b-9d4abe4d5ad3\") " pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:06.162624 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:06.162589 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret\") pod \"global-pull-secret-syncer-fk9jc\" (UID: \"f793756b-ba29-4f1d-878b-9d4abe4d5ad3\") " pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:06.163065 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:06.162750 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:06.163065 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:06.162833 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret podName:f793756b-ba29-4f1d-878b-9d4abe4d5ad3 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:07.162812321 +0000 UTC m=+19.261162250 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret") pod "global-pull-secret-syncer-fk9jc" (UID: "f793756b-ba29-4f1d-878b-9d4abe4d5ad3") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:07.168726 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:07.168695 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret\") pod \"global-pull-secret-syncer-fk9jc\" (UID: \"f793756b-ba29-4f1d-878b-9d4abe4d5ad3\") " pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:07.169127 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:07.168855 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:07.169127 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:07.168922 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret podName:f793756b-ba29-4f1d-878b-9d4abe4d5ad3 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:09.168904127 +0000 UTC m=+21.267254045 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret") pod "global-pull-secret-syncer-fk9jc" (UID: "f793756b-ba29-4f1d-878b-9d4abe4d5ad3") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:07.422973 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:07.422901 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:46:07.422973 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:07.422915 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:46:07.422973 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:07.422936 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:07.423305 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:07.423276 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v6b2n" podUID="e0d458b0-40cd-4eaf-8dbf-220566ae55ef" Apr 22 18:46:07.423848 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:07.423821 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bvpqn" podUID="fb5bea5b-4447-44e8-8573-662eda69835e" Apr 22 18:46:07.423953 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:07.423935 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fk9jc" podUID="f793756b-ba29-4f1d-878b-9d4abe4d5ad3" Apr 22 18:46:08.623752 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:08.623553 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hsb2f" event={"ID":"4867ac2c-3d1c-44a3-b5d4-495f207482ed","Type":"ContainerStarted","Data":"01b00e93557f76f13c57a256e664e958848b3cd67392af2a63ce8e15f8e502a1"} Apr 22 18:46:08.624894 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:08.624868 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-42.ec2.internal" event={"ID":"380fabb8e9eee8f3530eb1504d622a92","Type":"ContainerStarted","Data":"39e665427b03960c171f6ff9d07317d2a766c57cf71ba36219b9a646bb533aed"} Apr 22 18:46:08.627460 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:08.627441 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/ovn-acl-logging/0.log" Apr 22 18:46:08.627892 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:08.627866 2571 generic.go:358] "Generic (PLEG): container finished" podID="97c5443c-b607-45ba-8245-88b3b1af7d19" containerID="6e92a034e329c56118cb751530e4367c381cbf28e4a8c9051fc1b5cc3efa29ae" exitCode=1 Apr 22 18:46:08.628028 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:08.627939 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" event={"ID":"97c5443c-b607-45ba-8245-88b3b1af7d19","Type":"ContainerStarted","Data":"f388f0ce2f11b6c6db7ee56a43162a39380cf271bbe9d063d8631a6118804fd8"} Apr 22 18:46:08.628028 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:08.627968 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" event={"ID":"97c5443c-b607-45ba-8245-88b3b1af7d19","Type":"ContainerStarted","Data":"33745752ba0bc23853cbd0934d7dd1dcdf16e828c628e43a31102cd692138cc6"} Apr 22 18:46:08.628028 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:08.627983 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" event={"ID":"97c5443c-b607-45ba-8245-88b3b1af7d19","Type":"ContainerStarted","Data":"f18848279fc97ffd41141ecbec361ff7e9548e9327c40cec8437b25831bbc150"} Apr 22 18:46:08.628028 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:08.628005 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" event={"ID":"97c5443c-b607-45ba-8245-88b3b1af7d19","Type":"ContainerStarted","Data":"0286aeaf9f210cc56083fe736ffd44166853349d1639aa28f79b6e8bec9f733c"} Apr 22 18:46:08.628028 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:08.628021 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" event={"ID":"97c5443c-b607-45ba-8245-88b3b1af7d19","Type":"ContainerDied","Data":"6e92a034e329c56118cb751530e4367c381cbf28e4a8c9051fc1b5cc3efa29ae"} Apr 22 18:46:08.628284 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:08.628037 
2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" event={"ID":"97c5443c-b607-45ba-8245-88b3b1af7d19","Type":"ContainerStarted","Data":"10eb6dd007a42787a3f4f5b97140a3d076cfba70c52bc19606fd8743a19a095e"}
Apr 22 18:46:08.629248 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:08.629220 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" event={"ID":"72131875-3a6a-454e-a845-bdca533f20de","Type":"ContainerStarted","Data":"93937f7be3093e1859228d61c4558b7dbd8564649b32321a83f3dde483df3ea1"}
Apr 22 18:46:08.655319 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:08.655277 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hsb2f" podStartSLOduration=2.38632258 podStartE2EDuration="20.655263375s" podCreationTimestamp="2026-04-22 18:45:48 +0000 UTC" firstStartedPulling="2026-04-22 18:45:49.683490357 +0000 UTC m=+1.781840274" lastFinishedPulling="2026-04-22 18:46:07.952431147 +0000 UTC m=+20.050781069" observedRunningTime="2026-04-22 18:46:08.637739875 +0000 UTC m=+20.736089816" watchObservedRunningTime="2026-04-22 18:46:08.655263375 +0000 UTC m=+20.753613314"
Apr 22 18:46:08.667318 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:08.666904 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-4wgsq" podStartSLOduration=2.847346136 podStartE2EDuration="20.666886712s" podCreationTimestamp="2026-04-22 18:45:48 +0000 UTC" firstStartedPulling="2026-04-22 18:45:49.750523373 +0000 UTC m=+1.848873290" lastFinishedPulling="2026-04-22 18:46:07.570063947 +0000 UTC m=+19.668413866" observedRunningTime="2026-04-22 18:46:08.655092493 +0000 UTC m=+20.753442433" watchObservedRunningTime="2026-04-22 18:46:08.666886712 +0000 UTC m=+20.765236653"
Apr 22 18:46:09.184969 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:09.184937 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret\") pod \"global-pull-secret-syncer-fk9jc\" (UID: \"f793756b-ba29-4f1d-878b-9d4abe4d5ad3\") " pod="kube-system/global-pull-secret-syncer-fk9jc"
Apr 22 18:46:09.185183 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:09.185069 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:09.185183 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:09.185120 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret podName:f793756b-ba29-4f1d-878b-9d4abe4d5ad3 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:13.185106026 +0000 UTC m=+25.283455948 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret") pod "global-pull-secret-syncer-fk9jc" (UID: "f793756b-ba29-4f1d-878b-9d4abe4d5ad3") : object "kube-system"/"original-pull-secret" not registered
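The pod_startup_latency_tracker entries above relate their fields by a simple identity: podStartSLOduration is podStartE2EDuration minus the image-pull window (lastFinishedPulling − firstStartedPulling), i.e. creation-to-running time with pull time excluded. A quick check in Go using the multus-hsb2f figures copied from the entry above (the program is only illustrative; the field names come from the log):

package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the multus-hsb2f "Observed pod startup duration" entry.
	created, _ := time.Parse(time.RFC3339Nano, "2026-04-22T18:45:48Z")
	firstPull, _ := time.Parse(time.RFC3339Nano, "2026-04-22T18:45:49.683490357Z")
	lastPull, _ := time.Parse(time.RFC3339Nano, "2026-04-22T18:46:07.952431147Z")
	running, _ := time.Parse(time.RFC3339Nano, "2026-04-22T18:46:08.655263375Z")

	e2e := running.Sub(created)     // podStartE2EDuration: 20.655263375s, as logged
	pull := lastPull.Sub(firstPull) // image-pull window: 18.26894079s
	slo := e2e - pull               // podStartSLOduration: 2.386322585s, matching the logged 2.38632258 to within ns rounding

	fmt.Println(e2e, pull, slo)
}

The same identity holds for the tuned-4wgsq entry (20.666886712s minus a ~17.82s pull window gives the logged 2.847346136); the tiny nanosecond discrepancies disappear if the m=+ monotonic offsets are used instead of the wall-clock timestamps.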
Apr 22 18:46:09.422664 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:09.422596 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvpqn"
Apr 22 18:46:09.422787 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:09.422660 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v6b2n"
Apr 22 18:46:09.422787 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:09.422688 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fk9jc"
Apr 22 18:46:09.422787 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:09.422762 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v6b2n" podUID="e0d458b0-40cd-4eaf-8dbf-220566ae55ef"
Apr 22 18:46:09.422887 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:09.422836 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fk9jc" podUID="f793756b-ba29-4f1d-878b-9d4abe4d5ad3"
Apr 22 18:46:09.422887 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:09.422875 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-bvpqn" podUID="fb5bea5b-4447-44e8-8573-662eda69835e" Apr 22 18:46:09.631957 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:09.631931 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gvq84" event={"ID":"64e165b4-deda-4d6f-8f70-c28ac7cebec4","Type":"ContainerStarted","Data":"5d9dae5cc1bea96000619fa840385eaacd4bd7a318404922c1ea926df3a59048"} Apr 22 18:46:09.633093 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:09.633073 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q" event={"ID":"36758514-4d64-4551-b091-9b23e243572e","Type":"ContainerStarted","Data":"6871978b41ca44dd76f66a44ae4b2ea0958a2e6e8705554a92bf9f6fc0cdd5d4"} Apr 22 18:46:09.634356 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:09.634336 2571 generic.go:358] "Generic (PLEG): container finished" podID="22922279-9d57-4b39-9e9b-25a133f37c1b" containerID="a030defec8ee7ee545c0a6feeb4f99d44df0865d61b0f0d51c1c0de7650ea740" exitCode=0 Apr 22 18:46:09.634434 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:09.634397 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brbjp" event={"ID":"22922279-9d57-4b39-9e9b-25a133f37c1b","Type":"ContainerDied","Data":"a030defec8ee7ee545c0a6feeb4f99d44df0865d61b0f0d51c1c0de7650ea740"} Apr 22 18:46:09.635706 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:09.635674 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rvx2v" event={"ID":"e63b9430-ddb4-4626-8019-7bfa90ffac77","Type":"ContainerStarted","Data":"66b949883889f337faa25d65f454a22aeb9024f142f79f429d96b794f796a9e0"} Apr 22 18:46:09.637359 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:09.637311 2571 generic.go:358] "Generic (PLEG): container finished" podID="d02647afefe84c2445ae8b27f0b7998e" containerID="87fb153aeac7a70e65a0b20084ac1fa58f825bc214995bcdd6689633acc7690c" exitCode=0 Apr 22 18:46:09.637437 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:09.637378 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal" event={"ID":"d02647afefe84c2445ae8b27f0b7998e","Type":"ContainerDied","Data":"87fb153aeac7a70e65a0b20084ac1fa58f825bc214995bcdd6689633acc7690c"} Apr 22 18:46:09.640175 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:09.639038 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2s28b" event={"ID":"91da0166-33f7-46f2-9824-bf2339c00a28","Type":"ContainerStarted","Data":"2a0655cbdd78f1f8484c44ea795963ce46ad4fe6bce084e94c80596c84abea19"} Apr 22 18:46:09.641479 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:09.641446 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qzzp6" event={"ID":"0c75569b-ad2e-4296-8dad-807a8c913df1","Type":"ContainerStarted","Data":"9295d2ff5e501d6be77636070c7bbea7fb4e38de5d6b08712c04bda28e11d00c"} Apr 22 18:46:09.644043 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:09.644010 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-gvq84" podStartSLOduration=3.821295123 podStartE2EDuration="21.643999429s" podCreationTimestamp="2026-04-22 18:45:48 +0000 UTC" firstStartedPulling="2026-04-22 18:45:49.74662467 +0000 UTC m=+1.844974587" lastFinishedPulling="2026-04-22 18:46:07.569328975 +0000 UTC m=+19.667678893" 
observedRunningTime="2026-04-22 18:46:09.643968161 +0000 UTC m=+21.742318101" watchObservedRunningTime="2026-04-22 18:46:09.643999429 +0000 UTC m=+21.742349420" Apr 22 18:46:09.644512 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:09.644479 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-42.ec2.internal" podStartSLOduration=20.644470644 podStartE2EDuration="20.644470644s" podCreationTimestamp="2026-04-22 18:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:08.6666068 +0000 UTC m=+20.764956741" watchObservedRunningTime="2026-04-22 18:46:09.644470644 +0000 UTC m=+21.742820781" Apr 22 18:46:09.703210 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:09.702059 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2s28b" podStartSLOduration=3.922108007 podStartE2EDuration="21.702040595s" podCreationTimestamp="2026-04-22 18:45:48 +0000 UTC" firstStartedPulling="2026-04-22 18:45:49.712082097 +0000 UTC m=+1.810432014" lastFinishedPulling="2026-04-22 18:46:07.492014667 +0000 UTC m=+19.590364602" observedRunningTime="2026-04-22 18:46:09.689055964 +0000 UTC m=+21.787405907" watchObservedRunningTime="2026-04-22 18:46:09.702040595 +0000 UTC m=+21.800390537" Apr 22 18:46:09.703210 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:09.702501 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qzzp6" podStartSLOduration=3.849183245 podStartE2EDuration="21.702489685s" podCreationTimestamp="2026-04-22 18:45:48 +0000 UTC" firstStartedPulling="2026-04-22 18:45:49.638127872 +0000 UTC m=+1.736477790" lastFinishedPulling="2026-04-22 18:46:07.491434303 +0000 UTC m=+19.589784230" observedRunningTime="2026-04-22 18:46:09.702119573 +0000 UTC m=+21.800469514" watchObservedRunningTime="2026-04-22 18:46:09.702489685 +0000 UTC m=+21.800839627" Apr 22 18:46:09.715416 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:09.715375 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rvx2v" podStartSLOduration=3.871814207 podStartE2EDuration="21.71536238s" podCreationTimestamp="2026-04-22 18:45:48 +0000 UTC" firstStartedPulling="2026-04-22 18:45:49.647888125 +0000 UTC m=+1.746238043" lastFinishedPulling="2026-04-22 18:46:07.491436293 +0000 UTC m=+19.589786216" observedRunningTime="2026-04-22 18:46:09.714832921 +0000 UTC m=+21.813182861" watchObservedRunningTime="2026-04-22 18:46:09.71536238 +0000 UTC m=+21.813712319" Apr 22 18:46:09.772526 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:09.772491 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:46:10.390522 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:10.390399 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:46:09.772520719Z","UUID":"15ad5e85-b561-40f1-96de-e55ec1b9c9e3","Handler":null,"Name":"","Endpoint":""} Apr 22 18:46:10.394382 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:10.394345 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:46:10.394511 
ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:10.394386 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:46:10.645753 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:10.645658 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q" event={"ID":"36758514-4d64-4551-b091-9b23e243572e","Type":"ContainerStarted","Data":"ca528dd23c286d9bb281efdd8b69f8e776044fed5b5e195671cc4cb6a83e4e45"} Apr 22 18:46:10.647857 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:10.647831 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal" event={"ID":"d02647afefe84c2445ae8b27f0b7998e","Type":"ContainerStarted","Data":"88d1341fb69a19e834a09288c02885aafb5ae605c833d1637a8d32699b227eb5"} Apr 22 18:46:10.651281 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:10.651259 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/ovn-acl-logging/0.log" Apr 22 18:46:10.651742 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:10.651715 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" event={"ID":"97c5443c-b607-45ba-8245-88b3b1af7d19","Type":"ContainerStarted","Data":"febd359bcf5319b896d6d987f0acf6e4434c6b9cdb64d77f7eb0e2e53e29d180"} Apr 22 18:46:10.664614 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:10.664563 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-42.ec2.internal" podStartSLOduration=21.66452946 podStartE2EDuration="21.66452946s" podCreationTimestamp="2026-04-22 18:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:10.664160987 +0000 UTC m=+22.762510928" watchObservedRunningTime="2026-04-22 18:46:10.66452946 +0000 UTC m=+22.762879399" Apr 22 18:46:11.423104 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:11.422884 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:11.423287 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:11.422937 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:46:11.423287 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:11.423194 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fk9jc" podUID="f793756b-ba29-4f1d-878b-9d4abe4d5ad3" Apr 22 18:46:11.423287 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:11.423276 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bvpqn" podUID="fb5bea5b-4447-44e8-8573-662eda69835e"
Apr 22 18:46:11.423425 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:11.422937 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v6b2n"
Apr 22 18:46:11.423425 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:11.423378 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v6b2n" podUID="e0d458b0-40cd-4eaf-8dbf-220566ae55ef"
Apr 22 18:46:11.656020 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:11.655975 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q" event={"ID":"36758514-4d64-4551-b091-9b23e243572e","Type":"ContainerStarted","Data":"26b0b9360c1d02e74ca81c56c27aff29c4779da0a18a3ee51a91a95b5be1ba38"}
Apr 22 18:46:11.672851 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:11.672808 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-27l6q" podStartSLOduration=2.635340643 podStartE2EDuration="23.672795257s" podCreationTimestamp="2026-04-22 18:45:48 +0000 UTC" firstStartedPulling="2026-04-22 18:45:49.724705381 +0000 UTC m=+1.823055299" lastFinishedPulling="2026-04-22 18:46:10.762159984 +0000 UTC m=+22.860509913" observedRunningTime="2026-04-22 18:46:11.672144308 +0000 UTC m=+23.770494249" watchObservedRunningTime="2026-04-22 18:46:11.672795257 +0000 UTC m=+23.771145197"
Apr 22 18:46:13.218334 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:13.218290 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret\") pod \"global-pull-secret-syncer-fk9jc\" (UID: \"f793756b-ba29-4f1d-878b-9d4abe4d5ad3\") " pod="kube-system/global-pull-secret-syncer-fk9jc"
Apr 22 18:46:13.218839 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:13.218428 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:13.218839 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:13.218509 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret podName:f793756b-ba29-4f1d-878b-9d4abe4d5ad3 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:21.218489137 +0000 UTC m=+33.316839074 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret") pod "global-pull-secret-syncer-fk9jc" (UID: "f793756b-ba29-4f1d-878b-9d4abe4d5ad3") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:13.423332 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:13.423297 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fk9jc"
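The original-pull-secret mount has now failed five times, and the durationBeforeRetry values trace a doubling backoff: 500ms (18:46:05), 1s (18:46:06), 2s (18:46:07), 4s (18:46:09), 8s (18:46:13). A minimal sketch of such a policy in Go (illustrative only, not kubelet's implementation; the cap is an assumption for the sketch, since this excerpt never reaches one):

package main

import (
	"fmt"
	"time"
)

// nextDelay doubles the wait after every consecutive failure, starting at
// 500ms -- the durationBeforeRetry progression visible in the entries above.
// maxDelay is an assumed cap, not a value taken from the log.
func nextDelay(prev time.Duration) time.Duration {
	const initial = 500 * time.Millisecond
	const maxDelay = 2 * time.Minute
	if prev == 0 {
		return initial
	}
	if next := 2 * prev; next < maxDelay {
		return next
	}
	return maxDelay
}

func main() {
	var d time.Duration
	for i := 0; i < 5; i++ {
		d = nextDelay(d)
		fmt.Println(d) // 500ms, 1s, 2s, 4s, 8s
	}
}

The "No retries permitted until" deadline each entry prints is just the failure time plus this delay (18:46:05.658 + 500ms gives the logged 18:46:06.158), which is why the gaps between attempts widen from half a second to eight.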
Apr 22 18:46:13.423332 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:13.423328 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v6b2n"
Apr 22 18:46:13.423581 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:13.423297 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvpqn"
Apr 22 18:46:13.423581 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:13.423426 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fk9jc" podUID="f793756b-ba29-4f1d-878b-9d4abe4d5ad3"
Apr 22 18:46:13.423581 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:13.423551 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvpqn" podUID="fb5bea5b-4447-44e8-8573-662eda69835e"
Apr 22 18:46:13.423699 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:13.423659 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v6b2n" podUID="e0d458b0-40cd-4eaf-8dbf-220566ae55ef"
Apr 22 18:46:13.646255 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:13.646166 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2s28b"
Apr 22 18:46:13.646907 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:13.646876 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2s28b"
Apr 22 18:46:13.659101 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:13.659077 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2s28b"
Apr 22 18:46:13.659608 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:13.659580 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2s28b"
Apr 22 18:46:14.662177 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:14.661876 2571 generic.go:358] "Generic (PLEG): container finished" podID="22922279-9d57-4b39-9e9b-25a133f37c1b" containerID="de63a113a4f977969f07395955a7cad2a120d543aa82389120c4fa09ceb8757a" exitCode=0
Apr 22 18:46:14.662855 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:14.661957 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brbjp" event={"ID":"22922279-9d57-4b39-9e9b-25a133f37c1b","Type":"ContainerDied","Data":"de63a113a4f977969f07395955a7cad2a120d543aa82389120c4fa09ceb8757a"}
Apr 22 18:46:14.665428 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:14.665408 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/ovn-acl-logging/0.log"
Apr 22 18:46:14.665779 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:14.665756 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" event={"ID":"97c5443c-b607-45ba-8245-88b3b1af7d19","Type":"ContainerStarted","Data":"16ccb68b272905f2e19c673b614c65f74a3b08c292a9c6b62bb7aa0e7cd91077"} Apr 22 18:46:14.666064 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:14.666044 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:46:14.666156 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:14.666070 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:46:14.666156 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:14.666079 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:46:14.666254 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:14.666172 2571 scope.go:117] "RemoveContainer" containerID="6e92a034e329c56118cb751530e4367c381cbf28e4a8c9051fc1b5cc3efa29ae" Apr 22 18:46:14.682173 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:14.682150 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:46:14.682757 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:14.682739 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:46:15.423195 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:15.423165 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:46:15.423195 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:15.423185 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:46:15.423374 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:15.423185 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:15.423374 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:15.423258 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvpqn" podUID="fb5bea5b-4447-44e8-8573-662eda69835e" Apr 22 18:46:15.423374 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:15.423357 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v6b2n" podUID="e0d458b0-40cd-4eaf-8dbf-220566ae55ef" Apr 22 18:46:15.423479 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:15.423417 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fk9jc" podUID="f793756b-ba29-4f1d-878b-9d4abe4d5ad3" Apr 22 18:46:15.671073 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:15.671048 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/ovn-acl-logging/0.log" Apr 22 18:46:15.671522 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:15.671398 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" event={"ID":"97c5443c-b607-45ba-8245-88b3b1af7d19","Type":"ContainerStarted","Data":"87524e43577b6f4aac3c6d1b079ce8bb252e20b2d3ec67818a21c06b5753af3e"} Apr 22 18:46:15.673429 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:15.673363 2571 generic.go:358] "Generic (PLEG): container finished" podID="22922279-9d57-4b39-9e9b-25a133f37c1b" containerID="2f029384c7d5562df0d4d52de6f033b975936148ae086113bc2095686c2bba25" exitCode=0 Apr 22 18:46:15.673551 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:15.673453 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brbjp" event={"ID":"22922279-9d57-4b39-9e9b-25a133f37c1b","Type":"ContainerDied","Data":"2f029384c7d5562df0d4d52de6f033b975936148ae086113bc2095686c2bba25"} Apr 22 18:46:15.701380 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:15.701327 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" podStartSLOduration=9.828212445 podStartE2EDuration="27.701310013s" podCreationTimestamp="2026-04-22 18:45:48 +0000 UTC" firstStartedPulling="2026-04-22 18:45:49.697831897 +0000 UTC m=+1.796181815" lastFinishedPulling="2026-04-22 18:46:07.570929466 +0000 UTC m=+19.669279383" observedRunningTime="2026-04-22 18:46:15.699717925 +0000 UTC m=+27.798067868" watchObservedRunningTime="2026-04-22 18:46:15.701310013 +0000 UTC m=+27.799659954" Apr 22 18:46:16.032208 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:16.032175 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fk9jc"] Apr 22 18:46:16.032347 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:16.032279 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:16.032383 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:16.032367 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fk9jc" podUID="f793756b-ba29-4f1d-878b-9d4abe4d5ad3" Apr 22 18:46:16.038223 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:16.035598 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-v6b2n"] Apr 22 18:46:16.038223 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:16.035720 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:46:16.038223 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:16.035953 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v6b2n" podUID="e0d458b0-40cd-4eaf-8dbf-220566ae55ef" Apr 22 18:46:16.046474 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:16.046451 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bvpqn"] Apr 22 18:46:16.046583 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:16.046555 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:46:16.046670 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:16.046654 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvpqn" podUID="fb5bea5b-4447-44e8-8573-662eda69835e" Apr 22 18:46:17.423514 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:17.423338 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:17.424018 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:17.423383 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:46:17.424018 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:17.423603 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fk9jc" podUID="f793756b-ba29-4f1d-878b-9d4abe4d5ad3" Apr 22 18:46:17.424018 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:17.423411 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:46:17.424018 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:17.423691 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v6b2n" podUID="e0d458b0-40cd-4eaf-8dbf-220566ae55ef" Apr 22 18:46:17.424018 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:17.423731 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bvpqn" podUID="fb5bea5b-4447-44e8-8573-662eda69835e" Apr 22 18:46:17.681168 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:17.681077 2571 generic.go:358] "Generic (PLEG): container finished" podID="22922279-9d57-4b39-9e9b-25a133f37c1b" containerID="2a75079f81b6e4f6ec6c3ff4b7d86f1544efbfe08795d42d43443a880c1ac862" exitCode=0 Apr 22 18:46:17.681318 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:17.681157 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brbjp" event={"ID":"22922279-9d57-4b39-9e9b-25a133f37c1b","Type":"ContainerDied","Data":"2a75079f81b6e4f6ec6c3ff4b7d86f1544efbfe08795d42d43443a880c1ac862"} Apr 22 18:46:19.422662 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:19.422609 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:19.423196 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:19.422621 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:46:19.423196 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:19.422745 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fk9jc" podUID="f793756b-ba29-4f1d-878b-9d4abe4d5ad3" Apr 22 18:46:19.423196 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:19.422621 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:46:19.423196 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:19.422859 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v6b2n" podUID="e0d458b0-40cd-4eaf-8dbf-220566ae55ef" Apr 22 18:46:19.423196 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:19.422961 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bvpqn" podUID="fb5bea5b-4447-44e8-8573-662eda69835e" Apr 22 18:46:20.769614 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.769575 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-42.ec2.internal" event="NodeReady" Apr 22 18:46:20.770117 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.769739 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:46:20.810297 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.810258 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7d6f96f686-fwq9s"] Apr 22 18:46:20.840201 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.840061 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77"] Apr 22 18:46:20.841431 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.840678 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:20.843524 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.843496 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-mvb7n\"" Apr 22 18:46:20.843524 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.843515 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 18:46:20.843719 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.843501 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 18:46:20.843775 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.843503 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 18:46:20.853885 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.853862 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 18:46:20.859453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.859405 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-tnfzj"] Apr 22 18:46:20.859823 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.859742 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77" Apr 22 18:46:20.862838 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.862815 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-lwgbm\"" Apr 22 18:46:20.862941 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.862824 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 18:46:20.863020 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.863002 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 18:46:20.863174 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.863144 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 18:46:20.863236 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.863184 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 18:46:20.883294 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.883271 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-56d9858896-wq4xk"] Apr 22 18:46:20.883439 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.883415 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-tnfzj" Apr 22 18:46:20.885979 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.885960 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 18:46:20.886113 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.885997 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:46:20.886179 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.886143 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 18:46:20.886236 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.886215 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 18:46:20.886283 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.886223 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-7mc4b\"" Apr 22 18:46:20.900739 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.900714 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 18:46:20.913687 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.913664 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fdhpc"] Apr 22 18:46:20.913832 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.913814 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:20.916597 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.916575 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 18:46:20.916698 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.916604 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 18:46:20.916698 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.916627 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 22 18:46:20.916698 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.916575 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-47626\"" Apr 22 18:46:20.916964 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.916910 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 22 18:46:20.917069 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.916969 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 22 18:46:20.917069 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.916975 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 22 18:46:20.933518 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.933425 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwsv8"] Apr 22 18:46:20.933676 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.933555 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fdhpc" Apr 22 18:46:20.936529 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.936504 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 18:46:20.936653 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.936590 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:20.936653 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.936591 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-znnv5\"" Apr 22 18:46:20.950437 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.950416 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t"] Apr 22 18:46:20.950614 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.950596 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwsv8" Apr 22 18:46:20.953268 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.953250 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 18:46:20.953675 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.953654 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 18:46:20.953781 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.953768 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:20.953846 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.953785 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-mhcfz\"" Apr 22 18:46:20.954653 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.954634 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 18:46:20.969136 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.969106 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-5h829"] Apr 22 18:46:20.969259 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.969242 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t" Apr 22 18:46:20.972296 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.972102 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 18:46:20.972296 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.972135 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-cktvr\"" Apr 22 18:46:20.972296 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.972239 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 18:46:20.972776 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.972591 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:20.984433 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.984411 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a07f3c68-23f1-4479-83d7-6e71fb29694e-image-registry-private-configuration\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:20.984530 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.984445 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-9jp77\" (UID: 
\"d5b60b0a-0c13-4d95-83e6-b6d336565a6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77" Apr 22 18:46:20.984530 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.984510 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-bound-sa-token\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:20.984676 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.984605 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch7mv\" (UniqueName: \"kubernetes.io/projected/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-kube-api-access-ch7mv\") pod \"cluster-monitoring-operator-75587bd455-9jp77\" (UID: \"d5b60b0a-0c13-4d95-83e6-b6d336565a6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77" Apr 22 18:46:20.984676 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.984652 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80e69f6d-8332-4923-a03d-15387087dc5a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-tnfzj\" (UID: \"80e69f6d-8332-4923-a03d-15387087dc5a\") " pod="openshift-insights/insights-operator-585dfdc468-tnfzj" Apr 22 18:46:20.984676 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.984671 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80e69f6d-8332-4923-a03d-15387087dc5a-serving-cert\") pod \"insights-operator-585dfdc468-tnfzj\" (UID: \"80e69f6d-8332-4923-a03d-15387087dc5a\") " pod="openshift-insights/insights-operator-585dfdc468-tnfzj" Apr 22 18:46:20.984827 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.984690 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp4sz\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-kube-api-access-bp4sz\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:20.984827 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.984714 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/80e69f6d-8332-4923-a03d-15387087dc5a-tmp\") pod \"insights-operator-585dfdc468-tnfzj\" (UID: \"80e69f6d-8332-4923-a03d-15387087dc5a\") " pod="openshift-insights/insights-operator-585dfdc468-tnfzj" Apr 22 18:46:20.984827 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.984762 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80e69f6d-8332-4923-a03d-15387087dc5a-service-ca-bundle\") pod \"insights-operator-585dfdc468-tnfzj\" (UID: \"80e69f6d-8332-4923-a03d-15387087dc5a\") " pod="openshift-insights/insights-operator-585dfdc468-tnfzj" Apr 22 18:46:20.984827 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.984799 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7kz4\" (UniqueName: 
\"kubernetes.io/projected/80e69f6d-8332-4923-a03d-15387087dc5a-kube-api-access-n7kz4\") pod \"insights-operator-585dfdc468-tnfzj\" (UID: \"80e69f6d-8332-4923-a03d-15387087dc5a\") " pod="openshift-insights/insights-operator-585dfdc468-tnfzj" Apr 22 18:46:20.984976 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.984862 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:20.984976 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.984885 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a07f3c68-23f1-4479-83d7-6e71fb29694e-trusted-ca\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:20.984976 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.984916 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/80e69f6d-8332-4923-a03d-15387087dc5a-snapshots\") pod \"insights-operator-585dfdc468-tnfzj\" (UID: \"80e69f6d-8332-4923-a03d-15387087dc5a\") " pod="openshift-insights/insights-operator-585dfdc468-tnfzj" Apr 22 18:46:20.984976 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.984931 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-certificates\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:20.984976 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.984948 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a07f3c68-23f1-4479-83d7-6e71fb29694e-ca-trust-extracted\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:20.984976 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.984961 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a07f3c68-23f1-4479-83d7-6e71fb29694e-installation-pull-secrets\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:20.985250 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.984976 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9jp77\" (UID: \"d5b60b0a-0c13-4d95-83e6-b6d336565a6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77" Apr 22 18:46:20.988459 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.988414 2571 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77"] Apr 22 18:46:20.988459 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.988436 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7d6f96f686-fwq9s"] Apr 22 18:46:20.988459 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.988446 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lb4jp"] Apr 22 18:46:20.988663 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.988577 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-5h829" Apr 22 18:46:20.991335 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.991298 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 18:46:20.991443 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.991351 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 18:46:20.991510 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.991478 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 18:46:20.991811 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.991642 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-g7jm4\"" Apr 22 18:46:20.991811 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.991755 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:20.997238 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:20.997220 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 18:46:21.006676 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.006354 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fdhpc"] Apr 22 18:46:21.006676 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.006492 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:21.006835 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.006701 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vqvdc"] Apr 22 18:46:21.009521 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.009498 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:46:21.009642 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.009570 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wnqbz\"" Apr 22 18:46:21.009798 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.009778 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:46:21.018117 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.018097 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4njjm"] Apr 22 18:46:21.018263 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.018247 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vqvdc" Apr 22 18:46:21.021466 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.021420 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:21.021575 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.021483 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 18:46:21.021575 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.021524 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 18:46:21.021575 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.021527 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 18:46:21.021739 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.021662 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-hkjjx\"" Apr 22 18:46:21.031110 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.031093 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-kxklx"] Apr 22 18:46:21.031262 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.031244 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4njjm" Apr 22 18:46:21.033827 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.033808 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:46:21.033921 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.033895 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-5j5kr\"" Apr 22 18:46:21.034217 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.034198 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:46:21.034550 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.034522 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:46:21.045228 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.045209 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-5h829"] Apr 22 18:46:21.045337 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.045236 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-56d9858896-wq4xk"] Apr 22 18:46:21.045337 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.045250 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-kxklx"] Apr 22 18:46:21.045337 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.045263 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4njjm"] Apr 22 18:46:21.045337 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.045274 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lb4jp"] Apr 22 18:46:21.045337 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.045285 2571 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t"] Apr 22 18:46:21.045337 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.045295 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwsv8"] Apr 22 18:46:21.045337 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.045306 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vqvdc"] Apr 22 18:46:21.045337 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.045316 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-tnfzj"] Apr 22 18:46:21.045785 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.045340 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kxklx" Apr 22 18:46:21.048312 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.048292 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:46:21.048408 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.048293 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:46:21.048408 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.048355 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-dq2t8\"" Apr 22 18:46:21.085970 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.085946 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x2tl\" (UniqueName: \"kubernetes.io/projected/e8cf109c-0a09-449e-87f3-54ad5a412455-kube-api-access-7x2tl\") pod \"cluster-samples-operator-6dc5bdb6b4-fnw4t\" (UID: \"e8cf109c-0a09-449e-87f3-54ad5a412455\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t" Apr 22 18:46:21.086087 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.085986 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/166524a6-4f60-49b8-9020-7bae0d51168c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qwsv8\" (UID: \"166524a6-4f60-49b8-9020-7bae0d51168c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwsv8" Apr 22 18:46:21.086087 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086035 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr9bp\" (UniqueName: \"kubernetes.io/projected/a1739ddc-1770-4db7-a55f-9bc8b4cf2c65-kube-api-access-hr9bp\") pod \"volume-data-source-validator-7c6cbb6c87-fdhpc\" (UID: \"a1739ddc-1770-4db7-a55f-9bc8b4cf2c65\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fdhpc" Apr 22 18:46:21.086217 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086103 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12bf8f63-f32e-40cf-b529-9b5c1f6a9053-config\") pod \"console-operator-9d4b6777b-5h829\" (UID: 
\"12bf8f63-f32e-40cf-b529-9b5c1f6a9053\") " pod="openshift-console-operator/console-operator-9d4b6777b-5h829" Apr 22 18:46:21.086217 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086136 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80e69f6d-8332-4923-a03d-15387087dc5a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-tnfzj\" (UID: \"80e69f6d-8332-4923-a03d-15387087dc5a\") " pod="openshift-insights/insights-operator-585dfdc468-tnfzj" Apr 22 18:46:21.086217 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086165 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/80e69f6d-8332-4923-a03d-15387087dc5a-tmp\") pod \"insights-operator-585dfdc468-tnfzj\" (UID: \"80e69f6d-8332-4923-a03d-15387087dc5a\") " pod="openshift-insights/insights-operator-585dfdc468-tnfzj" Apr 22 18:46:21.086217 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086192 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-stats-auth\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:21.086422 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086218 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw64h\" (UniqueName: \"kubernetes.io/projected/0e27f58c-14be-45e2-8336-0577f514ae76-kube-api-access-vw64h\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:21.086422 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086265 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:21.086422 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086296 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpjh7\" (UniqueName: \"kubernetes.io/projected/166524a6-4f60-49b8-9020-7bae0d51168c-kube-api-access-zpjh7\") pod \"kube-storage-version-migrator-operator-6769c5d45-qwsv8\" (UID: \"166524a6-4f60-49b8-9020-7bae0d51168c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwsv8" Apr 22 18:46:21.086422 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086326 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-certificates\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:21.086422 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086351 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-9jp77\" (UID: \"d5b60b0a-0c13-4d95-83e6-b6d336565a6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77" Apr 22 18:46:21.086422 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086380 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr4k4\" (UniqueName: \"kubernetes.io/projected/12bf8f63-f32e-40cf-b529-9b5c1f6a9053-kube-api-access-wr4k4\") pod \"console-operator-9d4b6777b-5h829\" (UID: \"12bf8f63-f32e-40cf-b529-9b5c1f6a9053\") " pod="openshift-console-operator/console-operator-9d4b6777b-5h829" Apr 22 18:46:21.086422 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086407 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12bf8f63-f32e-40cf-b529-9b5c1f6a9053-serving-cert\") pod \"console-operator-9d4b6777b-5h829\" (UID: \"12bf8f63-f32e-40cf-b529-9b5c1f6a9053\") " pod="openshift-console-operator/console-operator-9d4b6777b-5h829" Apr 22 18:46:21.086920 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086449 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-bound-sa-token\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:21.086920 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086472 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:21.086920 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086500 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ch7mv\" (UniqueName: \"kubernetes.io/projected/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-kube-api-access-ch7mv\") pod \"cluster-monitoring-operator-75587bd455-9jp77\" (UID: \"d5b60b0a-0c13-4d95-83e6-b6d336565a6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77" Apr 22 18:46:21.086920 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086577 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80e69f6d-8332-4923-a03d-15387087dc5a-serving-cert\") pod \"insights-operator-585dfdc468-tnfzj\" (UID: \"80e69f6d-8332-4923-a03d-15387087dc5a\") " pod="openshift-insights/insights-operator-585dfdc468-tnfzj" Apr 22 18:46:21.086920 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086605 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bp4sz\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-kube-api-access-bp4sz\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:21.086920 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086636 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-default-certificate\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:21.086920 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086645 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/80e69f6d-8332-4923-a03d-15387087dc5a-tmp\") pod \"insights-operator-585dfdc468-tnfzj\" (UID: \"80e69f6d-8332-4923-a03d-15387087dc5a\") " pod="openshift-insights/insights-operator-585dfdc468-tnfzj" Apr 22 18:46:21.086920 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086670 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80e69f6d-8332-4923-a03d-15387087dc5a-service-ca-bundle\") pod \"insights-operator-585dfdc468-tnfzj\" (UID: \"80e69f6d-8332-4923-a03d-15387087dc5a\") " pod="openshift-insights/insights-operator-585dfdc468-tnfzj" Apr 22 18:46:21.086920 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086710 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7kz4\" (UniqueName: \"kubernetes.io/projected/80e69f6d-8332-4923-a03d-15387087dc5a-kube-api-access-n7kz4\") pod \"insights-operator-585dfdc468-tnfzj\" (UID: \"80e69f6d-8332-4923-a03d-15387087dc5a\") " pod="openshift-insights/insights-operator-585dfdc468-tnfzj" Apr 22 18:46:21.086920 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086761 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:21.086920 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086798 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-9jp77\" (UID: \"d5b60b0a-0c13-4d95-83e6-b6d336565a6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77" Apr 22 18:46:21.086920 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086831 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a07f3c68-23f1-4479-83d7-6e71fb29694e-trusted-ca\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:21.086920 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086859 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/166524a6-4f60-49b8-9020-7bae0d51168c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qwsv8\" (UID: \"166524a6-4f60-49b8-9020-7bae0d51168c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwsv8" Apr 22 18:46:21.086920 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086893 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: 
\"kubernetes.io/empty-dir/80e69f6d-8332-4923-a03d-15387087dc5a-snapshots\") pod \"insights-operator-585dfdc468-tnfzj\" (UID: \"80e69f6d-8332-4923-a03d-15387087dc5a\") " pod="openshift-insights/insights-operator-585dfdc468-tnfzj" Apr 22 18:46:21.086920 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086923 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fnw4t\" (UID: \"e8cf109c-0a09-449e-87f3-54ad5a412455\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t" Apr 22 18:46:21.087644 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086952 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a07f3c68-23f1-4479-83d7-6e71fb29694e-ca-trust-extracted\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:21.087644 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.086980 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a07f3c68-23f1-4479-83d7-6e71fb29694e-installation-pull-secrets\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:21.087644 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.087008 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12bf8f63-f32e-40cf-b529-9b5c1f6a9053-trusted-ca\") pod \"console-operator-9d4b6777b-5h829\" (UID: \"12bf8f63-f32e-40cf-b529-9b5c1f6a9053\") " pod="openshift-console-operator/console-operator-9d4b6777b-5h829" Apr 22 18:46:21.087644 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.087050 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a07f3c68-23f1-4479-83d7-6e71fb29694e-image-registry-private-configuration\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:21.087644 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.087133 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80e69f6d-8332-4923-a03d-15387087dc5a-service-ca-bundle\") pod \"insights-operator-585dfdc468-tnfzj\" (UID: \"80e69f6d-8332-4923-a03d-15387087dc5a\") " pod="openshift-insights/insights-operator-585dfdc468-tnfzj" Apr 22 18:46:21.087644 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.087133 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80e69f6d-8332-4923-a03d-15387087dc5a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-tnfzj\" (UID: \"80e69f6d-8332-4923-a03d-15387087dc5a\") " pod="openshift-insights/insights-operator-585dfdc468-tnfzj" Apr 22 18:46:21.087644 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.087265 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret 
"image-registry-tls" not found Apr 22 18:46:21.087644 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.087280 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d6f96f686-fwq9s: secret "image-registry-tls" not found Apr 22 18:46:21.087644 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.087344 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls podName:a07f3c68-23f1-4479-83d7-6e71fb29694e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:21.587327091 +0000 UTC m=+33.685677041 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls") pod "image-registry-7d6f96f686-fwq9s" (UID: "a07f3c68-23f1-4479-83d7-6e71fb29694e") : secret "image-registry-tls" not found Apr 22 18:46:21.088031 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.087669 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:46:21.088031 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.087725 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls podName:d5b60b0a-0c13-4d95-83e6-b6d336565a6b nodeName:}" failed. No retries permitted until 2026-04-22 18:46:21.587709027 +0000 UTC m=+33.686058950 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9jp77" (UID: "d5b60b0a-0c13-4d95-83e6-b6d336565a6b") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:46:21.088031 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.087856 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-certificates\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:21.088334 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.088311 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-9jp77\" (UID: \"d5b60b0a-0c13-4d95-83e6-b6d336565a6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77" Apr 22 18:46:21.088450 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.088420 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a07f3c68-23f1-4479-83d7-6e71fb29694e-ca-trust-extracted\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:21.088516 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.088459 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/80e69f6d-8332-4923-a03d-15387087dc5a-snapshots\") pod \"insights-operator-585dfdc468-tnfzj\" (UID: 
\"80e69f6d-8332-4923-a03d-15387087dc5a\") " pod="openshift-insights/insights-operator-585dfdc468-tnfzj" Apr 22 18:46:21.088613 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.088584 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a07f3c68-23f1-4479-83d7-6e71fb29694e-trusted-ca\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:21.092053 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.092030 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80e69f6d-8332-4923-a03d-15387087dc5a-serving-cert\") pod \"insights-operator-585dfdc468-tnfzj\" (UID: \"80e69f6d-8332-4923-a03d-15387087dc5a\") " pod="openshift-insights/insights-operator-585dfdc468-tnfzj" Apr 22 18:46:21.092135 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.092085 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a07f3c68-23f1-4479-83d7-6e71fb29694e-installation-pull-secrets\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:21.092135 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.092085 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a07f3c68-23f1-4479-83d7-6e71fb29694e-image-registry-private-configuration\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:21.095825 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.095801 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-bound-sa-token\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:21.095927 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.095868 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7kz4\" (UniqueName: \"kubernetes.io/projected/80e69f6d-8332-4923-a03d-15387087dc5a-kube-api-access-n7kz4\") pod \"insights-operator-585dfdc468-tnfzj\" (UID: \"80e69f6d-8332-4923-a03d-15387087dc5a\") " pod="openshift-insights/insights-operator-585dfdc468-tnfzj" Apr 22 18:46:21.096900 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.096861 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp4sz\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-kube-api-access-bp4sz\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:21.097585 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.097394 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch7mv\" (UniqueName: \"kubernetes.io/projected/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-kube-api-access-ch7mv\") pod \"cluster-monitoring-operator-75587bd455-9jp77\" (UID: \"d5b60b0a-0c13-4d95-83e6-b6d336565a6b\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77" Apr 22 18:46:21.188084 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188045 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12bf8f63-f32e-40cf-b529-9b5c1f6a9053-trusted-ca\") pod \"console-operator-9d4b6777b-5h829\" (UID: \"12bf8f63-f32e-40cf-b529-9b5c1f6a9053\") " pod="openshift-console-operator/console-operator-9d4b6777b-5h829" Apr 22 18:46:21.188273 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188115 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7x2tl\" (UniqueName: \"kubernetes.io/projected/e8cf109c-0a09-449e-87f3-54ad5a412455-kube-api-access-7x2tl\") pod \"cluster-samples-operator-6dc5bdb6b4-fnw4t\" (UID: \"e8cf109c-0a09-449e-87f3-54ad5a412455\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t" Apr 22 18:46:21.188273 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188142 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/166524a6-4f60-49b8-9020-7bae0d51168c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qwsv8\" (UID: \"166524a6-4f60-49b8-9020-7bae0d51168c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwsv8" Apr 22 18:46:21.188273 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188163 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hr9bp\" (UniqueName: \"kubernetes.io/projected/a1739ddc-1770-4db7-a55f-9bc8b4cf2c65-kube-api-access-hr9bp\") pod \"volume-data-source-validator-7c6cbb6c87-fdhpc\" (UID: \"a1739ddc-1770-4db7-a55f-9bc8b4cf2c65\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fdhpc" Apr 22 18:46:21.188273 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188195 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38e2749f-84dd-40f0-b864-c5aaddc913a8-serving-cert\") pod \"service-ca-operator-d6fc45fc5-vqvdc\" (UID: \"38e2749f-84dd-40f0-b864-c5aaddc913a8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vqvdc" Apr 22 18:46:21.188273 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188230 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38e2749f-84dd-40f0-b864-c5aaddc913a8-config\") pod \"service-ca-operator-d6fc45fc5-vqvdc\" (UID: \"38e2749f-84dd-40f0-b864-c5aaddc913a8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vqvdc" Apr 22 18:46:21.188273 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188254 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9d429a1-a93e-47db-bed8-196a5bb0f748-config-volume\") pod \"dns-default-lb4jp\" (UID: \"f9d429a1-a93e-47db-bed8-196a5bb0f748\") " pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:21.188614 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188319 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12bf8f63-f32e-40cf-b529-9b5c1f6a9053-config\") pod \"console-operator-9d4b6777b-5h829\" (UID: 
\"12bf8f63-f32e-40cf-b529-9b5c1f6a9053\") " pod="openshift-console-operator/console-operator-9d4b6777b-5h829" Apr 22 18:46:21.188614 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188363 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbfjj\" (UniqueName: \"kubernetes.io/projected/a2f5711c-2c58-47dd-a89b-ab792485adbf-kube-api-access-wbfjj\") pod \"network-check-source-8894fc9bd-kxklx\" (UID: \"a2f5711c-2c58-47dd-a89b-ab792485adbf\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kxklx" Apr 22 18:46:21.188614 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188413 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-stats-auth\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:21.188614 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188442 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f9d429a1-a93e-47db-bed8-196a5bb0f748-tmp-dir\") pod \"dns-default-lb4jp\" (UID: \"f9d429a1-a93e-47db-bed8-196a5bb0f748\") " pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:21.188614 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188472 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vw64h\" (UniqueName: \"kubernetes.io/projected/0e27f58c-14be-45e2-8336-0577f514ae76-kube-api-access-vw64h\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:21.188614 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188504 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcx9h\" (UniqueName: \"kubernetes.io/projected/f9d429a1-a93e-47db-bed8-196a5bb0f748-kube-api-access-jcx9h\") pod \"dns-default-lb4jp\" (UID: \"f9d429a1-a93e-47db-bed8-196a5bb0f748\") " pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:21.188614 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188580 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpjh7\" (UniqueName: \"kubernetes.io/projected/166524a6-4f60-49b8-9020-7bae0d51168c-kube-api-access-zpjh7\") pod \"kube-storage-version-migrator-operator-6769c5d45-qwsv8\" (UID: \"166524a6-4f60-49b8-9020-7bae0d51168c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwsv8" Apr 22 18:46:21.188954 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188625 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert\") pod \"ingress-canary-4njjm\" (UID: \"e23ae0a9-f762-4687-bfe0-c02d473142be\") " pod="openshift-ingress-canary/ingress-canary-4njjm" Apr 22 18:46:21.188954 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188655 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wr4k4\" (UniqueName: \"kubernetes.io/projected/12bf8f63-f32e-40cf-b529-9b5c1f6a9053-kube-api-access-wr4k4\") pod \"console-operator-9d4b6777b-5h829\" (UID: \"12bf8f63-f32e-40cf-b529-9b5c1f6a9053\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-5h829" Apr 22 18:46:21.188954 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188682 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12bf8f63-f32e-40cf-b529-9b5c1f6a9053-serving-cert\") pod \"console-operator-9d4b6777b-5h829\" (UID: \"12bf8f63-f32e-40cf-b529-9b5c1f6a9053\") " pod="openshift-console-operator/console-operator-9d4b6777b-5h829" Apr 22 18:46:21.188954 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188712 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs\") pod \"network-metrics-daemon-v6b2n\" (UID: \"e0d458b0-40cd-4eaf-8dbf-220566ae55ef\") " pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:46:21.188954 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188743 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:21.188954 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188785 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-default-certificate\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:21.188954 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188810 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls\") pod \"dns-default-lb4jp\" (UID: \"f9d429a1-a93e-47db-bed8-196a5bb0f748\") " pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:21.188954 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188839 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk8nc\" (UniqueName: \"kubernetes.io/projected/38e2749f-84dd-40f0-b864-c5aaddc913a8-kube-api-access-mk8nc\") pod \"service-ca-operator-d6fc45fc5-vqvdc\" (UID: \"38e2749f-84dd-40f0-b864-c5aaddc913a8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vqvdc" Apr 22 18:46:21.188954 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188867 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:21.188954 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188901 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsqt6\" (UniqueName: \"kubernetes.io/projected/e23ae0a9-f762-4687-bfe0-c02d473142be-kube-api-access-gsqt6\") pod \"ingress-canary-4njjm\" (UID: \"e23ae0a9-f762-4687-bfe0-c02d473142be\") " pod="openshift-ingress-canary/ingress-canary-4njjm" Apr 22 18:46:21.188954 ip-10-0-133-42 kubenswrapper[2571]: I0422 
18:46:21.188934 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/166524a6-4f60-49b8-9020-7bae0d51168c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qwsv8\" (UID: \"166524a6-4f60-49b8-9020-7bae0d51168c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwsv8" Apr 22 18:46:21.189419 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.188967 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fnw4t\" (UID: \"e8cf109c-0a09-449e-87f3-54ad5a412455\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t" Apr 22 18:46:21.189419 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.189042 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12bf8f63-f32e-40cf-b529-9b5c1f6a9053-trusted-ca\") pod \"console-operator-9d4b6777b-5h829\" (UID: \"12bf8f63-f32e-40cf-b529-9b5c1f6a9053\") " pod="openshift-console-operator/console-operator-9d4b6777b-5h829" Apr 22 18:46:21.189419 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.189083 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:46:21.189419 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.189149 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls podName:e8cf109c-0a09-449e-87f3-54ad5a412455 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:21.689130442 +0000 UTC m=+33.787480365 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fnw4t" (UID: "e8cf109c-0a09-449e-87f3-54ad5a412455") : secret "samples-operator-tls" not found Apr 22 18:46:21.189419 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.189162 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:21.189419 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.189221 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs podName:e0d458b0-40cd-4eaf-8dbf-220566ae55ef nodeName:}" failed. No retries permitted until 2026-04-22 18:46:53.189205297 +0000 UTC m=+65.287555218 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs") pod "network-metrics-daemon-v6b2n" (UID: "e0d458b0-40cd-4eaf-8dbf-220566ae55ef") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:21.189419 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.189282 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12bf8f63-f32e-40cf-b529-9b5c1f6a9053-config\") pod \"console-operator-9d4b6777b-5h829\" (UID: \"12bf8f63-f32e-40cf-b529-9b5c1f6a9053\") " pod="openshift-console-operator/console-operator-9d4b6777b-5h829" Apr 22 18:46:21.189419 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.189300 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle podName:0e27f58c-14be-45e2-8336-0577f514ae76 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:21.689289029 +0000 UTC m=+33.787638981 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle") pod "router-default-56d9858896-wq4xk" (UID: "0e27f58c-14be-45e2-8336-0577f514ae76") : configmap references non-existent config key: service-ca.crt Apr 22 18:46:21.189419 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.189386 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:46:21.189419 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.189420 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs podName:0e27f58c-14be-45e2-8336-0577f514ae76 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:21.689409263 +0000 UTC m=+33.787759195 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs") pod "router-default-56d9858896-wq4xk" (UID: "0e27f58c-14be-45e2-8336-0577f514ae76") : secret "router-metrics-certs-default" not found Apr 22 18:46:21.189878 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.189743 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/166524a6-4f60-49b8-9020-7bae0d51168c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qwsv8\" (UID: \"166524a6-4f60-49b8-9020-7bae0d51168c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwsv8" Apr 22 18:46:21.191404 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.191361 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/166524a6-4f60-49b8-9020-7bae0d51168c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qwsv8\" (UID: \"166524a6-4f60-49b8-9020-7bae0d51168c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwsv8" Apr 22 18:46:21.191610 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.191585 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-stats-auth\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:21.192879 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.192857 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12bf8f63-f32e-40cf-b529-9b5c1f6a9053-serving-cert\") pod \"console-operator-9d4b6777b-5h829\" (UID: \"12bf8f63-f32e-40cf-b529-9b5c1f6a9053\") " pod="openshift-console-operator/console-operator-9d4b6777b-5h829" Apr 22 18:46:21.193722 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.193705 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-tnfzj" Apr 22 18:46:21.199574 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.199507 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpjh7\" (UniqueName: \"kubernetes.io/projected/166524a6-4f60-49b8-9020-7bae0d51168c-kube-api-access-zpjh7\") pod \"kube-storage-version-migrator-operator-6769c5d45-qwsv8\" (UID: \"166524a6-4f60-49b8-9020-7bae0d51168c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwsv8" Apr 22 18:46:21.199663 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.199578 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr4k4\" (UniqueName: \"kubernetes.io/projected/12bf8f63-f32e-40cf-b529-9b5c1f6a9053-kube-api-access-wr4k4\") pod \"console-operator-9d4b6777b-5h829\" (UID: \"12bf8f63-f32e-40cf-b529-9b5c1f6a9053\") " pod="openshift-console-operator/console-operator-9d4b6777b-5h829" Apr 22 18:46:21.199920 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.199864 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw64h\" (UniqueName: \"kubernetes.io/projected/0e27f58c-14be-45e2-8336-0577f514ae76-kube-api-access-vw64h\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:21.199920 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.199909 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-default-certificate\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:21.200336 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.200315 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x2tl\" (UniqueName: \"kubernetes.io/projected/e8cf109c-0a09-449e-87f3-54ad5a412455-kube-api-access-7x2tl\") pod \"cluster-samples-operator-6dc5bdb6b4-fnw4t\" (UID: \"e8cf109c-0a09-449e-87f3-54ad5a412455\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t" Apr 22 18:46:21.206263 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.206239 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr9bp\" (UniqueName: \"kubernetes.io/projected/a1739ddc-1770-4db7-a55f-9bc8b4cf2c65-kube-api-access-hr9bp\") pod \"volume-data-source-validator-7c6cbb6c87-fdhpc\" (UID: \"a1739ddc-1770-4db7-a55f-9bc8b4cf2c65\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fdhpc" Apr 22 18:46:21.243118 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.243086 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fdhpc" Apr 22 18:46:21.260339 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.260315 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwsv8" Apr 22 18:46:21.290417 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.290235 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls\") pod \"dns-default-lb4jp\" (UID: \"f9d429a1-a93e-47db-bed8-196a5bb0f748\") " pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:21.290583 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.290424 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mk8nc\" (UniqueName: \"kubernetes.io/projected/38e2749f-84dd-40f0-b864-c5aaddc913a8-kube-api-access-mk8nc\") pod \"service-ca-operator-d6fc45fc5-vqvdc\" (UID: \"38e2749f-84dd-40f0-b864-c5aaddc913a8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vqvdc" Apr 22 18:46:21.290583 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.290433 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:46:21.290583 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.290459 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gsqt6\" (UniqueName: \"kubernetes.io/projected/e23ae0a9-f762-4687-bfe0-c02d473142be-kube-api-access-gsqt6\") pod \"ingress-canary-4njjm\" (UID: \"e23ae0a9-f762-4687-bfe0-c02d473142be\") " pod="openshift-ingress-canary/ingress-canary-4njjm" Apr 22 18:46:21.290583 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.290493 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljlj7\" (UniqueName: \"kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7\") pod \"network-check-target-bvpqn\" (UID: \"fb5bea5b-4447-44e8-8573-662eda69835e\") " pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:46:21.290583 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.290526 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls podName:f9d429a1-a93e-47db-bed8-196a5bb0f748 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:21.790503496 +0000 UTC m=+33.888853435 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls") pod "dns-default-lb4jp" (UID: "f9d429a1-a93e-47db-bed8-196a5bb0f748") : secret "dns-default-metrics-tls" not found Apr 22 18:46:21.290864 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.290603 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38e2749f-84dd-40f0-b864-c5aaddc913a8-serving-cert\") pod \"service-ca-operator-d6fc45fc5-vqvdc\" (UID: \"38e2749f-84dd-40f0-b864-c5aaddc913a8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vqvdc" Apr 22 18:46:21.290864 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.290634 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38e2749f-84dd-40f0-b864-c5aaddc913a8-config\") pod \"service-ca-operator-d6fc45fc5-vqvdc\" (UID: \"38e2749f-84dd-40f0-b864-c5aaddc913a8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vqvdc" Apr 22 18:46:21.290864 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.290658 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9d429a1-a93e-47db-bed8-196a5bb0f748-config-volume\") pod \"dns-default-lb4jp\" (UID: \"f9d429a1-a93e-47db-bed8-196a5bb0f748\") " pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:21.290864 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.290696 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbfjj\" (UniqueName: \"kubernetes.io/projected/a2f5711c-2c58-47dd-a89b-ab792485adbf-kube-api-access-wbfjj\") pod \"network-check-source-8894fc9bd-kxklx\" (UID: \"a2f5711c-2c58-47dd-a89b-ab792485adbf\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kxklx" Apr 22 18:46:21.290864 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.290742 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f9d429a1-a93e-47db-bed8-196a5bb0f748-tmp-dir\") pod \"dns-default-lb4jp\" (UID: \"f9d429a1-a93e-47db-bed8-196a5bb0f748\") " pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:21.290864 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.290776 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcx9h\" (UniqueName: \"kubernetes.io/projected/f9d429a1-a93e-47db-bed8-196a5bb0f748-kube-api-access-jcx9h\") pod \"dns-default-lb4jp\" (UID: \"f9d429a1-a93e-47db-bed8-196a5bb0f748\") " pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:21.290864 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.290808 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret\") pod \"global-pull-secret-syncer-fk9jc\" (UID: \"f793756b-ba29-4f1d-878b-9d4abe4d5ad3\") " pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:21.290864 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.290861 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert\") pod \"ingress-canary-4njjm\" (UID: \"e23ae0a9-f762-4687-bfe0-c02d473142be\") " pod="openshift-ingress-canary/ingress-canary-4njjm" Apr 22 
18:46:21.291314 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.291024 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:46:21.291314 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.291065 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert podName:e23ae0a9-f762-4687-bfe0-c02d473142be nodeName:}" failed. No retries permitted until 2026-04-22 18:46:21.791051719 +0000 UTC m=+33.889401639 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert") pod "ingress-canary-4njjm" (UID: "e23ae0a9-f762-4687-bfe0-c02d473142be") : secret "canary-serving-cert" not found Apr 22 18:46:21.291420 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.291383 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9d429a1-a93e-47db-bed8-196a5bb0f748-config-volume\") pod \"dns-default-lb4jp\" (UID: \"f9d429a1-a93e-47db-bed8-196a5bb0f748\") " pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:21.292193 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.291849 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:21.292193 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.291899 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret podName:f793756b-ba29-4f1d-878b-9d4abe4d5ad3 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:37.291884726 +0000 UTC m=+49.390234666 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret") pod "global-pull-secret-syncer-fk9jc" (UID: "f793756b-ba29-4f1d-878b-9d4abe4d5ad3") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:21.292193 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.292152 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f9d429a1-a93e-47db-bed8-196a5bb0f748-tmp-dir\") pod \"dns-default-lb4jp\" (UID: \"f9d429a1-a93e-47db-bed8-196a5bb0f748\") " pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:21.292430 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.292301 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38e2749f-84dd-40f0-b864-c5aaddc913a8-config\") pod \"service-ca-operator-d6fc45fc5-vqvdc\" (UID: \"38e2749f-84dd-40f0-b864-c5aaddc913a8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vqvdc" Apr 22 18:46:21.294599 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.294554 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljlj7\" (UniqueName: \"kubernetes.io/projected/fb5bea5b-4447-44e8-8573-662eda69835e-kube-api-access-ljlj7\") pod \"network-check-target-bvpqn\" (UID: \"fb5bea5b-4447-44e8-8573-662eda69835e\") " pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:46:21.294770 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.294733 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38e2749f-84dd-40f0-b864-c5aaddc913a8-serving-cert\") pod \"service-ca-operator-d6fc45fc5-vqvdc\" (UID: \"38e2749f-84dd-40f0-b864-c5aaddc913a8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vqvdc" Apr 22 18:46:21.309282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.299349 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-5h829" Apr 22 18:46:21.309282 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.309049 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk8nc\" (UniqueName: \"kubernetes.io/projected/38e2749f-84dd-40f0-b864-c5aaddc913a8-kube-api-access-mk8nc\") pod \"service-ca-operator-d6fc45fc5-vqvdc\" (UID: \"38e2749f-84dd-40f0-b864-c5aaddc913a8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vqvdc" Apr 22 18:46:21.309855 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.309713 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbfjj\" (UniqueName: \"kubernetes.io/projected/a2f5711c-2c58-47dd-a89b-ab792485adbf-kube-api-access-wbfjj\") pod \"network-check-source-8894fc9bd-kxklx\" (UID: \"a2f5711c-2c58-47dd-a89b-ab792485adbf\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kxklx" Apr 22 18:46:21.310242 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.310020 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsqt6\" (UniqueName: \"kubernetes.io/projected/e23ae0a9-f762-4687-bfe0-c02d473142be-kube-api-access-gsqt6\") pod \"ingress-canary-4njjm\" (UID: \"e23ae0a9-f762-4687-bfe0-c02d473142be\") " pod="openshift-ingress-canary/ingress-canary-4njjm" Apr 22 18:46:21.312570 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.310737 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcx9h\" (UniqueName: \"kubernetes.io/projected/f9d429a1-a93e-47db-bed8-196a5bb0f748-kube-api-access-jcx9h\") pod \"dns-default-lb4jp\" (UID: \"f9d429a1-a93e-47db-bed8-196a5bb0f748\") " pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:21.327966 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.327834 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vqvdc" Apr 22 18:46:21.354618 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.354245 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kxklx" Apr 22 18:46:21.381214 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.381104 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-tnfzj"] Apr 22 18:46:21.392642 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.391681 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fdhpc"] Apr 22 18:46:21.421962 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.421908 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwsv8"] Apr 22 18:46:21.423438 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.423185 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:21.424650 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.423730 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:46:21.424650 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.423742 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:46:21.427769 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.427389 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4wrrh\"" Apr 22 18:46:21.427769 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.427414 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:46:21.427769 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.427583 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:46:21.427769 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.427690 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-scwsl\"" Apr 22 18:46:21.446788 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.446752 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:46:21.476455 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.476416 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-5h829"] Apr 22 18:46:21.478698 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.478677 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vqvdc"] Apr 22 18:46:21.594185 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.594089 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:21.594185 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.594147 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9jp77\" (UID: \"d5b60b0a-0c13-4d95-83e6-b6d336565a6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77" Apr 22 18:46:21.594431 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.594205 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:46:21.594431 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.594222 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d6f96f686-fwq9s: secret "image-registry-tls" not found Apr 22 18:46:21.594431 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.594261 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:46:21.594431 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.594284 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls podName:a07f3c68-23f1-4479-83d7-6e71fb29694e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:22.594264964 +0000 UTC m=+34.692614897 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls") pod "image-registry-7d6f96f686-fwq9s" (UID: "a07f3c68-23f1-4479-83d7-6e71fb29694e") : secret "image-registry-tls" not found Apr 22 18:46:21.594431 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.594324 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls podName:d5b60b0a-0c13-4d95-83e6-b6d336565a6b nodeName:}" failed. No retries permitted until 2026-04-22 18:46:22.594306611 +0000 UTC m=+34.692656544 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9jp77" (UID: "d5b60b0a-0c13-4d95-83e6-b6d336565a6b") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:46:21.695712 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.695674 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fnw4t\" (UID: \"e8cf109c-0a09-449e-87f3-54ad5a412455\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t" Apr 22 18:46:21.695917 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.695827 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:46:21.695917 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.695831 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:21.695917 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.695895 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls podName:e8cf109c-0a09-449e-87f3-54ad5a412455 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:22.695872173 +0000 UTC m=+34.794222149 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fnw4t" (UID: "e8cf109c-0a09-449e-87f3-54ad5a412455") : secret "samples-operator-tls" not found Apr 22 18:46:21.696086 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.695945 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle podName:0e27f58c-14be-45e2-8336-0577f514ae76 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:22.695920601 +0000 UTC m=+34.794270524 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle") pod "router-default-56d9858896-wq4xk" (UID: "0e27f58c-14be-45e2-8336-0577f514ae76") : configmap references non-existent config key: service-ca.crt Apr 22 18:46:21.696086 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.696019 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:21.696164 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.696129 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:46:21.696207 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.696170 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs podName:0e27f58c-14be-45e2-8336-0577f514ae76 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:22.696156588 +0000 UTC m=+34.794506515 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs") pod "router-default-56d9858896-wq4xk" (UID: "0e27f58c-14be-45e2-8336-0577f514ae76") : secret "router-metrics-certs-default" not found Apr 22 18:46:21.796666 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.796624 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert\") pod \"ingress-canary-4njjm\" (UID: \"e23ae0a9-f762-4687-bfe0-c02d473142be\") " pod="openshift-ingress-canary/ingress-canary-4njjm" Apr 22 18:46:21.797112 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:21.796722 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls\") pod \"dns-default-lb4jp\" (UID: \"f9d429a1-a93e-47db-bed8-196a5bb0f748\") " pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:21.797112 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.796787 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:46:21.797112 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.796850 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert podName:e23ae0a9-f762-4687-bfe0-c02d473142be nodeName:}" failed. No retries permitted until 2026-04-22 18:46:22.796832947 +0000 UTC m=+34.895182866 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert") pod "ingress-canary-4njjm" (UID: "e23ae0a9-f762-4687-bfe0-c02d473142be") : secret "canary-serving-cert" not found Apr 22 18:46:21.797112 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.796891 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:46:21.797112 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:21.796936 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls podName:f9d429a1-a93e-47db-bed8-196a5bb0f748 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:22.796925572 +0000 UTC m=+34.895275493 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls") pod "dns-default-lb4jp" (UID: "f9d429a1-a93e-47db-bed8-196a5bb0f748") : secret "dns-default-metrics-tls" not found Apr 22 18:46:22.604137 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:22.604097 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:22.604137 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:22.604144 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9jp77\" (UID: \"d5b60b0a-0c13-4d95-83e6-b6d336565a6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77" Apr 22 18:46:22.604391 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:22.604258 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:46:22.604391 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:22.604291 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d6f96f686-fwq9s: secret "image-registry-tls" not found Apr 22 18:46:22.604391 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:22.604350 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls podName:a07f3c68-23f1-4479-83d7-6e71fb29694e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:24.60433472 +0000 UTC m=+36.702684637 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls") pod "image-registry-7d6f96f686-fwq9s" (UID: "a07f3c68-23f1-4479-83d7-6e71fb29694e") : secret "image-registry-tls" not found Apr 22 18:46:22.604391 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:22.604263 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:46:22.604626 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:22.604423 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls podName:d5b60b0a-0c13-4d95-83e6-b6d336565a6b nodeName:}" failed. No retries permitted until 2026-04-22 18:46:24.60440182 +0000 UTC m=+36.702751741 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9jp77" (UID: "d5b60b0a-0c13-4d95-83e6-b6d336565a6b") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:46:22.705275 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:22.705235 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:22.705473 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:22.705308 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:22.705473 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:22.705352 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fnw4t\" (UID: \"e8cf109c-0a09-449e-87f3-54ad5a412455\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t" Apr 22 18:46:22.705473 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:22.705413 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle podName:0e27f58c-14be-45e2-8336-0577f514ae76 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:24.705395823 +0000 UTC m=+36.803745744 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle") pod "router-default-56d9858896-wq4xk" (UID: "0e27f58c-14be-45e2-8336-0577f514ae76") : configmap references non-existent config key: service-ca.crt Apr 22 18:46:22.705672 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:22.705473 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:46:22.705672 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:22.705552 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs podName:0e27f58c-14be-45e2-8336-0577f514ae76 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:24.705518637 +0000 UTC m=+36.803868575 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs") pod "router-default-56d9858896-wq4xk" (UID: "0e27f58c-14be-45e2-8336-0577f514ae76") : secret "router-metrics-certs-default" not found Apr 22 18:46:22.705672 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:22.705476 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:46:22.705672 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:22.705607 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls podName:e8cf109c-0a09-449e-87f3-54ad5a412455 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:24.705597052 +0000 UTC m=+36.803946971 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fnw4t" (UID: "e8cf109c-0a09-449e-87f3-54ad5a412455") : secret "samples-operator-tls" not found Apr 22 18:46:22.806577 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:22.806531 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert\") pod \"ingress-canary-4njjm\" (UID: \"e23ae0a9-f762-4687-bfe0-c02d473142be\") " pod="openshift-ingress-canary/ingress-canary-4njjm" Apr 22 18:46:22.807035 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:22.806625 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls\") pod \"dns-default-lb4jp\" (UID: \"f9d429a1-a93e-47db-bed8-196a5bb0f748\") " pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:22.807035 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:22.806700 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:46:22.807035 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:22.806775 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert podName:e23ae0a9-f762-4687-bfe0-c02d473142be nodeName:}" failed. No retries permitted until 2026-04-22 18:46:24.806754067 +0000 UTC m=+36.905103998 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert") pod "ingress-canary-4njjm" (UID: "e23ae0a9-f762-4687-bfe0-c02d473142be") : secret "canary-serving-cert" not found Apr 22 18:46:22.807035 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:22.806803 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:46:22.807035 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:22.806859 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls podName:f9d429a1-a93e-47db-bed8-196a5bb0f748 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:24.806843015 +0000 UTC m=+36.905192940 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls") pod "dns-default-lb4jp" (UID: "f9d429a1-a93e-47db-bed8-196a5bb0f748") : secret "dns-default-metrics-tls" not found Apr 22 18:46:23.486016 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.485980 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8686b48ddc-cx2n5"] Apr 22 18:46:23.528624 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.528588 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cf8c4bbff-b96bl"] Apr 22 18:46:23.528828 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.528678 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8686b48ddc-cx2n5" Apr 22 18:46:23.531601 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.531558 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-6vlpg\"" Apr 22 18:46:23.531726 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.531608 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 18:46:23.531791 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.531716 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 18:46:23.532786 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.532768 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 18:46:23.532786 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.532774 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 18:46:23.543564 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.543499 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r"] Apr 22 18:46:23.548293 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.544334 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cf8c4bbff-b96bl" Apr 22 18:46:23.549006 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.548984 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 18:46:23.563139 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:46:23.563113 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod166524a6_4f60_49b8_9020_7bae0d51168c.slice/crio-748085f6b7a621b82a66921072eb0032098f952866a8c6149964f3bab0955c05 WatchSource:0}: Error finding container 748085f6b7a621b82a66921072eb0032098f952866a8c6149964f3bab0955c05: Status 404 returned error can't find the container with id 748085f6b7a621b82a66921072eb0032098f952866a8c6149964f3bab0955c05 Apr 22 18:46:23.564689 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:46:23.564110 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80e69f6d_8332_4923_a03d_15387087dc5a.slice/crio-2c8ea49206995e7bc6ec71179eb1b5381343a46b4bf0291b4bfd8d258efbc2c1 WatchSource:0}: Error finding container 2c8ea49206995e7bc6ec71179eb1b5381343a46b4bf0291b4bfd8d258efbc2c1: Status 404 returned error can't find the container with id 2c8ea49206995e7bc6ec71179eb1b5381343a46b4bf0291b4bfd8d258efbc2c1 Apr 22 18:46:23.565344 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:46:23.565327 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12bf8f63_f32e_40cf_b529_9b5c1f6a9053.slice/crio-bae89656d222d0b113fae05549574aaed32a7919858865b13a2fab2158229b8b WatchSource:0}: Error finding container bae89656d222d0b113fae05549574aaed32a7919858865b13a2fab2158229b8b: Status 404 returned error can't find the container with id bae89656d222d0b113fae05549574aaed32a7919858865b13a2fab2158229b8b Apr 22 18:46:23.566347 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:46:23.566324 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1739ddc_1770_4db7_a55f_9bc8b4cf2c65.slice/crio-836ac1ee05e980fcb3359c472d0fc78f69242db4cb995472356dabd89e99a998 WatchSource:0}: Error finding container 836ac1ee05e980fcb3359c472d0fc78f69242db4cb995472356dabd89e99a998: Status 404 returned error can't find the container with id 836ac1ee05e980fcb3359c472d0fc78f69242db4cb995472356dabd89e99a998 Apr 22 18:46:23.567592 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:46:23.567559 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38e2749f_84dd_40f0_b864_c5aaddc913a8.slice/crio-d1b09732c6280abcbbd54f1e2274472e625934593196e15435bec9d3deeb60fb WatchSource:0}: Error finding container d1b09732c6280abcbbd54f1e2274472e625934593196e15435bec9d3deeb60fb: Status 404 returned error can't find the container with id d1b09732c6280abcbbd54f1e2274472e625934593196e15435bec9d3deeb60fb Apr 22 18:46:23.569729 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.569705 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8686b48ddc-cx2n5"] Apr 22 18:46:23.569813 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.569740 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cf8c4bbff-b96bl"] Apr 22 18:46:23.569813 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.569753 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r"] Apr 22 18:46:23.569924 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.569907 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 18:46:23.573059 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.573040 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 18:46:23.573379 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.573356 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 18:46:23.573641 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.573624 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 18:46:23.573843 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.573826 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 18:46:23.612474 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.612406 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/714499d9-f67c-41fd-bb2a-dcdf1721b2b9-ca\") pod \"cluster-proxy-proxy-agent-74878b968b-8gv5r\" (UID: \"714499d9-f67c-41fd-bb2a-dcdf1721b2b9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 18:46:23.612679 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.612514 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/714499d9-f67c-41fd-bb2a-dcdf1721b2b9-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-74878b968b-8gv5r\" (UID: \"714499d9-f67c-41fd-bb2a-dcdf1721b2b9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 18:46:23.612679 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.612559 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69d4s\" (UniqueName: \"kubernetes.io/projected/69e4cebf-e0c9-4dbf-9863-ed0cc7b49669-kube-api-access-69d4s\") pod \"managed-serviceaccount-addon-agent-8686b48ddc-cx2n5\" (UID: \"69e4cebf-e0c9-4dbf-9863-ed0cc7b49669\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8686b48ddc-cx2n5" Apr 22 18:46:23.612679 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.612612 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/714499d9-f67c-41fd-bb2a-dcdf1721b2b9-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-74878b968b-8gv5r\" (UID: \"714499d9-f67c-41fd-bb2a-dcdf1721b2b9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 18:46:23.612679 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.612645 
2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/833c11fe-7743-42be-ba1f-d3d7dbb6678c-klusterlet-config\") pod \"klusterlet-addon-workmgr-6cf8c4bbff-b96bl\" (UID: \"833c11fe-7743-42be-ba1f-d3d7dbb6678c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cf8c4bbff-b96bl" Apr 22 18:46:23.612890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.612681 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvptm\" (UniqueName: \"kubernetes.io/projected/714499d9-f67c-41fd-bb2a-dcdf1721b2b9-kube-api-access-mvptm\") pod \"cluster-proxy-proxy-agent-74878b968b-8gv5r\" (UID: \"714499d9-f67c-41fd-bb2a-dcdf1721b2b9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 18:46:23.612890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.612767 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/714499d9-f67c-41fd-bb2a-dcdf1721b2b9-hub\") pod \"cluster-proxy-proxy-agent-74878b968b-8gv5r\" (UID: \"714499d9-f67c-41fd-bb2a-dcdf1721b2b9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 18:46:23.612890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.612795 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/714499d9-f67c-41fd-bb2a-dcdf1721b2b9-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-74878b968b-8gv5r\" (UID: \"714499d9-f67c-41fd-bb2a-dcdf1721b2b9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 18:46:23.612890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.612832 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/833c11fe-7743-42be-ba1f-d3d7dbb6678c-tmp\") pod \"klusterlet-addon-workmgr-6cf8c4bbff-b96bl\" (UID: \"833c11fe-7743-42be-ba1f-d3d7dbb6678c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cf8c4bbff-b96bl" Apr 22 18:46:23.612890 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.612872 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkfwz\" (UniqueName: \"kubernetes.io/projected/833c11fe-7743-42be-ba1f-d3d7dbb6678c-kube-api-access-jkfwz\") pod \"klusterlet-addon-workmgr-6cf8c4bbff-b96bl\" (UID: \"833c11fe-7743-42be-ba1f-d3d7dbb6678c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cf8c4bbff-b96bl" Apr 22 18:46:23.613119 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.612899 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/69e4cebf-e0c9-4dbf-9863-ed0cc7b49669-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8686b48ddc-cx2n5\" (UID: \"69e4cebf-e0c9-4dbf-9863-ed0cc7b49669\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8686b48ddc-cx2n5" Apr 22 18:46:23.695141 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.695101 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bvpqn"] Apr 22 18:46:23.695752 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.695725 
2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5h829" event={"ID":"12bf8f63-f32e-40cf-b529-9b5c1f6a9053","Type":"ContainerStarted","Data":"bae89656d222d0b113fae05549574aaed32a7919858865b13a2fab2158229b8b"} Apr 22 18:46:23.697325 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.696900 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fdhpc" event={"ID":"a1739ddc-1770-4db7-a55f-9bc8b4cf2c65","Type":"ContainerStarted","Data":"836ac1ee05e980fcb3359c472d0fc78f69242db4cb995472356dabd89e99a998"} Apr 22 18:46:23.698300 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.698274 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vqvdc" event={"ID":"38e2749f-84dd-40f0-b864-c5aaddc913a8","Type":"ContainerStarted","Data":"d1b09732c6280abcbbd54f1e2274472e625934593196e15435bec9d3deeb60fb"} Apr 22 18:46:23.698815 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.698795 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-kxklx"] Apr 22 18:46:23.699499 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.699482 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-tnfzj" event={"ID":"80e69f6d-8332-4923-a03d-15387087dc5a","Type":"ContainerStarted","Data":"2c8ea49206995e7bc6ec71179eb1b5381343a46b4bf0291b4bfd8d258efbc2c1"} Apr 22 18:46:23.699982 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:46:23.699940 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb5bea5b_4447_44e8_8573_662eda69835e.slice/crio-02dc32de81b3942d9ebc38caf97d863ddc168dcede7090dc7876257be696e492 WatchSource:0}: Error finding container 02dc32de81b3942d9ebc38caf97d863ddc168dcede7090dc7876257be696e492: Status 404 returned error can't find the container with id 02dc32de81b3942d9ebc38caf97d863ddc168dcede7090dc7876257be696e492 Apr 22 18:46:23.700966 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.700841 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwsv8" event={"ID":"166524a6-4f60-49b8-9020-7bae0d51168c","Type":"ContainerStarted","Data":"748085f6b7a621b82a66921072eb0032098f952866a8c6149964f3bab0955c05"} Apr 22 18:46:23.701668 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:46:23.701648 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2f5711c_2c58_47dd_a89b_ab792485adbf.slice/crio-65f940de24813e6d5ae7ed3f2c4c2c9d123dbe831826995dc4b29ba25e887457 WatchSource:0}: Error finding container 65f940de24813e6d5ae7ed3f2c4c2c9d123dbe831826995dc4b29ba25e887457: Status 404 returned error can't find the container with id 65f940de24813e6d5ae7ed3f2c4c2c9d123dbe831826995dc4b29ba25e887457 Apr 22 18:46:23.713310 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.713286 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/714499d9-f67c-41fd-bb2a-dcdf1721b2b9-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-74878b968b-8gv5r\" (UID: \"714499d9-f67c-41fd-bb2a-dcdf1721b2b9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 
18:46:23.713420 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.713323 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/833c11fe-7743-42be-ba1f-d3d7dbb6678c-klusterlet-config\") pod \"klusterlet-addon-workmgr-6cf8c4bbff-b96bl\" (UID: \"833c11fe-7743-42be-ba1f-d3d7dbb6678c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cf8c4bbff-b96bl" Apr 22 18:46:23.713420 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.713363 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvptm\" (UniqueName: \"kubernetes.io/projected/714499d9-f67c-41fd-bb2a-dcdf1721b2b9-kube-api-access-mvptm\") pod \"cluster-proxy-proxy-agent-74878b968b-8gv5r\" (UID: \"714499d9-f67c-41fd-bb2a-dcdf1721b2b9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 18:46:23.713504 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.713428 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/714499d9-f67c-41fd-bb2a-dcdf1721b2b9-hub\") pod \"cluster-proxy-proxy-agent-74878b968b-8gv5r\" (UID: \"714499d9-f67c-41fd-bb2a-dcdf1721b2b9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 18:46:23.713504 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.713445 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/714499d9-f67c-41fd-bb2a-dcdf1721b2b9-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-74878b968b-8gv5r\" (UID: \"714499d9-f67c-41fd-bb2a-dcdf1721b2b9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 18:46:23.713504 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.713468 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/833c11fe-7743-42be-ba1f-d3d7dbb6678c-tmp\") pod \"klusterlet-addon-workmgr-6cf8c4bbff-b96bl\" (UID: \"833c11fe-7743-42be-ba1f-d3d7dbb6678c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cf8c4bbff-b96bl" Apr 22 18:46:23.713504 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.713496 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkfwz\" (UniqueName: \"kubernetes.io/projected/833c11fe-7743-42be-ba1f-d3d7dbb6678c-kube-api-access-jkfwz\") pod \"klusterlet-addon-workmgr-6cf8c4bbff-b96bl\" (UID: \"833c11fe-7743-42be-ba1f-d3d7dbb6678c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cf8c4bbff-b96bl" Apr 22 18:46:23.713773 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.713514 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/69e4cebf-e0c9-4dbf-9863-ed0cc7b49669-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8686b48ddc-cx2n5\" (UID: \"69e4cebf-e0c9-4dbf-9863-ed0cc7b49669\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8686b48ddc-cx2n5" Apr 22 18:46:23.713773 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.713580 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/714499d9-f67c-41fd-bb2a-dcdf1721b2b9-ca\") pod \"cluster-proxy-proxy-agent-74878b968b-8gv5r\" (UID: 
\"714499d9-f67c-41fd-bb2a-dcdf1721b2b9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 18:46:23.713773 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.713633 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/714499d9-f67c-41fd-bb2a-dcdf1721b2b9-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-74878b968b-8gv5r\" (UID: \"714499d9-f67c-41fd-bb2a-dcdf1721b2b9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 18:46:23.713773 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.713663 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69d4s\" (UniqueName: \"kubernetes.io/projected/69e4cebf-e0c9-4dbf-9863-ed0cc7b49669-kube-api-access-69d4s\") pod \"managed-serviceaccount-addon-agent-8686b48ddc-cx2n5\" (UID: \"69e4cebf-e0c9-4dbf-9863-ed0cc7b49669\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8686b48ddc-cx2n5" Apr 22 18:46:23.714230 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.714205 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/714499d9-f67c-41fd-bb2a-dcdf1721b2b9-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-74878b968b-8gv5r\" (UID: \"714499d9-f67c-41fd-bb2a-dcdf1721b2b9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 18:46:23.717772 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.717620 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/714499d9-f67c-41fd-bb2a-dcdf1721b2b9-ca\") pod \"cluster-proxy-proxy-agent-74878b968b-8gv5r\" (UID: \"714499d9-f67c-41fd-bb2a-dcdf1721b2b9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 18:46:23.717862 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.717670 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/714499d9-f67c-41fd-bb2a-dcdf1721b2b9-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-74878b968b-8gv5r\" (UID: \"714499d9-f67c-41fd-bb2a-dcdf1721b2b9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 18:46:23.717862 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.717729 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/714499d9-f67c-41fd-bb2a-dcdf1721b2b9-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-74878b968b-8gv5r\" (UID: \"714499d9-f67c-41fd-bb2a-dcdf1721b2b9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 18:46:23.717862 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.717849 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/69e4cebf-e0c9-4dbf-9863-ed0cc7b49669-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8686b48ddc-cx2n5\" (UID: \"69e4cebf-e0c9-4dbf-9863-ed0cc7b49669\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8686b48ddc-cx2n5" Apr 22 18:46:23.717961 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.717922 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: 
\"kubernetes.io/secret/714499d9-f67c-41fd-bb2a-dcdf1721b2b9-hub\") pod \"cluster-proxy-proxy-agent-74878b968b-8gv5r\" (UID: \"714499d9-f67c-41fd-bb2a-dcdf1721b2b9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 18:46:23.720953 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.720933 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/833c11fe-7743-42be-ba1f-d3d7dbb6678c-tmp\") pod \"klusterlet-addon-workmgr-6cf8c4bbff-b96bl\" (UID: \"833c11fe-7743-42be-ba1f-d3d7dbb6678c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cf8c4bbff-b96bl" Apr 22 18:46:23.721175 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.721160 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkfwz\" (UniqueName: \"kubernetes.io/projected/833c11fe-7743-42be-ba1f-d3d7dbb6678c-kube-api-access-jkfwz\") pod \"klusterlet-addon-workmgr-6cf8c4bbff-b96bl\" (UID: \"833c11fe-7743-42be-ba1f-d3d7dbb6678c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cf8c4bbff-b96bl" Apr 22 18:46:23.721218 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.721208 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/833c11fe-7743-42be-ba1f-d3d7dbb6678c-klusterlet-config\") pod \"klusterlet-addon-workmgr-6cf8c4bbff-b96bl\" (UID: \"833c11fe-7743-42be-ba1f-d3d7dbb6678c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cf8c4bbff-b96bl" Apr 22 18:46:23.721363 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.721341 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvptm\" (UniqueName: \"kubernetes.io/projected/714499d9-f67c-41fd-bb2a-dcdf1721b2b9-kube-api-access-mvptm\") pod \"cluster-proxy-proxy-agent-74878b968b-8gv5r\" (UID: \"714499d9-f67c-41fd-bb2a-dcdf1721b2b9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 18:46:23.721398 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.721347 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69d4s\" (UniqueName: \"kubernetes.io/projected/69e4cebf-e0c9-4dbf-9863-ed0cc7b49669-kube-api-access-69d4s\") pod \"managed-serviceaccount-addon-agent-8686b48ddc-cx2n5\" (UID: \"69e4cebf-e0c9-4dbf-9863-ed0cc7b49669\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8686b48ddc-cx2n5" Apr 22 18:46:23.849958 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.849873 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8686b48ddc-cx2n5" Apr 22 18:46:23.857042 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.857021 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cf8c4bbff-b96bl" Apr 22 18:46:23.894334 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:23.894300 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 18:46:24.017226 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:24.017136 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cf8c4bbff-b96bl"] Apr 22 18:46:24.018780 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:24.018755 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8686b48ddc-cx2n5"] Apr 22 18:46:24.025748 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:46:24.025718 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod833c11fe_7743_42be_ba1f_d3d7dbb6678c.slice/crio-43cf88b1ebd94888bbf969726ff5b81ea933bea97d3252f13d3b1132fcf95acb WatchSource:0}: Error finding container 43cf88b1ebd94888bbf969726ff5b81ea933bea97d3252f13d3b1132fcf95acb: Status 404 returned error can't find the container with id 43cf88b1ebd94888bbf969726ff5b81ea933bea97d3252f13d3b1132fcf95acb Apr 22 18:46:24.025983 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:46:24.025956 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69e4cebf_e0c9_4dbf_9863_ed0cc7b49669.slice/crio-09c4f7c54c6164c3120a55b68eebfebf1dbadb5866866db7e53e0571f5823d6c WatchSource:0}: Error finding container 09c4f7c54c6164c3120a55b68eebfebf1dbadb5866866db7e53e0571f5823d6c: Status 404 returned error can't find the container with id 09c4f7c54c6164c3120a55b68eebfebf1dbadb5866866db7e53e0571f5823d6c Apr 22 18:46:24.057246 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:24.057211 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r"] Apr 22 18:46:24.060876 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:46:24.060848 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod714499d9_f67c_41fd_bb2a_dcdf1721b2b9.slice/crio-c97f796de71554d3e90dc2843a9e45f45696a2d4a89bd4e50f7af5066b1997ab WatchSource:0}: Error finding container c97f796de71554d3e90dc2843a9e45f45696a2d4a89bd4e50f7af5066b1997ab: Status 404 returned error can't find the container with id c97f796de71554d3e90dc2843a9e45f45696a2d4a89bd4e50f7af5066b1997ab Apr 22 18:46:24.629813 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:24.628618 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:24.629813 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:24.628674 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9jp77\" (UID: \"d5b60b0a-0c13-4d95-83e6-b6d336565a6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77" Apr 22 18:46:24.629813 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:24.628878 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret 
"cluster-monitoring-operator-tls" not found Apr 22 18:46:24.629813 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:24.628939 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls podName:d5b60b0a-0c13-4d95-83e6-b6d336565a6b nodeName:}" failed. No retries permitted until 2026-04-22 18:46:28.628919657 +0000 UTC m=+40.727269589 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9jp77" (UID: "d5b60b0a-0c13-4d95-83e6-b6d336565a6b") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:46:24.629813 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:24.629657 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:46:24.629813 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:24.629676 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d6f96f686-fwq9s: secret "image-registry-tls" not found Apr 22 18:46:24.629813 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:24.629724 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls podName:a07f3c68-23f1-4479-83d7-6e71fb29694e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:28.629708377 +0000 UTC m=+40.728058299 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls") pod "image-registry-7d6f96f686-fwq9s" (UID: "a07f3c68-23f1-4479-83d7-6e71fb29694e") : secret "image-registry-tls" not found Apr 22 18:46:24.708629 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:24.708557 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8686b48ddc-cx2n5" event={"ID":"69e4cebf-e0c9-4dbf-9863-ed0cc7b49669","Type":"ContainerStarted","Data":"09c4f7c54c6164c3120a55b68eebfebf1dbadb5866866db7e53e0571f5823d6c"} Apr 22 18:46:24.710749 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:24.710694 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kxklx" event={"ID":"a2f5711c-2c58-47dd-a89b-ab792485adbf","Type":"ContainerStarted","Data":"65f940de24813e6d5ae7ed3f2c4c2c9d123dbe831826995dc4b29ba25e887457"} Apr 22 18:46:24.714064 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:24.714005 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" event={"ID":"714499d9-f67c-41fd-bb2a-dcdf1721b2b9","Type":"ContainerStarted","Data":"c97f796de71554d3e90dc2843a9e45f45696a2d4a89bd4e50f7af5066b1997ab"} Apr 22 18:46:24.723521 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:24.723491 2571 generic.go:358] "Generic (PLEG): container finished" podID="22922279-9d57-4b39-9e9b-25a133f37c1b" containerID="9564d99bc25fad0c089d834cb8989feddd7727267b1569c7f2207394463d97af" exitCode=0 Apr 22 18:46:24.723677 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:24.723579 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brbjp" 
event={"ID":"22922279-9d57-4b39-9e9b-25a133f37c1b","Type":"ContainerDied","Data":"9564d99bc25fad0c089d834cb8989feddd7727267b1569c7f2207394463d97af"} Apr 22 18:46:24.729286 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:24.729257 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cf8c4bbff-b96bl" event={"ID":"833c11fe-7743-42be-ba1f-d3d7dbb6678c","Type":"ContainerStarted","Data":"43cf88b1ebd94888bbf969726ff5b81ea933bea97d3252f13d3b1132fcf95acb"} Apr 22 18:46:24.729286 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:24.729281 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:24.729450 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:24.729329 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fnw4t\" (UID: \"e8cf109c-0a09-449e-87f3-54ad5a412455\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t" Apr 22 18:46:24.729512 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:24.729474 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:46:24.729512 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:24.729476 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:24.729630 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:24.729531 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs podName:0e27f58c-14be-45e2-8336-0577f514ae76 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:28.729513891 +0000 UTC m=+40.827863830 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs") pod "router-default-56d9858896-wq4xk" (UID: "0e27f58c-14be-45e2-8336-0577f514ae76") : secret "router-metrics-certs-default" not found Apr 22 18:46:24.729630 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:24.729607 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle podName:0e27f58c-14be-45e2-8336-0577f514ae76 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:28.729596063 +0000 UTC m=+40.827945984 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle") pod "router-default-56d9858896-wq4xk" (UID: "0e27f58c-14be-45e2-8336-0577f514ae76") : configmap references non-existent config key: service-ca.crt Apr 22 18:46:24.729744 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:24.729680 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:46:24.729744 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:24.729738 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls podName:e8cf109c-0a09-449e-87f3-54ad5a412455 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:28.729722088 +0000 UTC m=+40.828072006 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fnw4t" (UID: "e8cf109c-0a09-449e-87f3-54ad5a412455") : secret "samples-operator-tls" not found Apr 22 18:46:24.733098 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:24.733071 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bvpqn" event={"ID":"fb5bea5b-4447-44e8-8573-662eda69835e","Type":"ContainerStarted","Data":"02dc32de81b3942d9ebc38caf97d863ddc168dcede7090dc7876257be696e492"} Apr 22 18:46:24.830896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:24.830844 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert\") pod \"ingress-canary-4njjm\" (UID: \"e23ae0a9-f762-4687-bfe0-c02d473142be\") " pod="openshift-ingress-canary/ingress-canary-4njjm" Apr 22 18:46:24.831073 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:24.830985 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls\") pod \"dns-default-lb4jp\" (UID: \"f9d429a1-a93e-47db-bed8-196a5bb0f748\") " pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:24.831847 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:24.831811 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:46:24.831960 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:24.831880 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert podName:e23ae0a9-f762-4687-bfe0-c02d473142be nodeName:}" failed. No retries permitted until 2026-04-22 18:46:28.831860943 +0000 UTC m=+40.930210865 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert") pod "ingress-canary-4njjm" (UID: "e23ae0a9-f762-4687-bfe0-c02d473142be") : secret "canary-serving-cert" not found Apr 22 18:46:24.832475 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:24.832456 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:46:24.832637 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:24.832509 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls podName:f9d429a1-a93e-47db-bed8-196a5bb0f748 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:28.832494133 +0000 UTC m=+40.930844056 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls") pod "dns-default-lb4jp" (UID: "f9d429a1-a93e-47db-bed8-196a5bb0f748") : secret "dns-default-metrics-tls" not found Apr 22 18:46:25.379988 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:25.379900 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-2fc7s"] Apr 22 18:46:25.406599 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:25.406555 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-2fc7s"] Apr 22 18:46:25.406770 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:25.406697 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-2fc7s" Apr 22 18:46:25.412112 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:25.411423 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 18:46:25.412112 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:25.411569 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-fszwb\"" Apr 22 18:46:25.412112 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:25.411862 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 18:46:25.437842 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:25.437808 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a06f033a-b65b-48a1-bac1-47daf3491118-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-2fc7s\" (UID: \"a06f033a-b65b-48a1-bac1-47daf3491118\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2fc7s" Apr 22 18:46:25.438026 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:25.437868 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2fc7s\" (UID: \"a06f033a-b65b-48a1-bac1-47daf3491118\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2fc7s" Apr 22 18:46:25.539019 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:25.538984 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/a06f033a-b65b-48a1-bac1-47daf3491118-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-2fc7s\" (UID: \"a06f033a-b65b-48a1-bac1-47daf3491118\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2fc7s" Apr 22 18:46:25.539187 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:25.539051 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2fc7s\" (UID: \"a06f033a-b65b-48a1-bac1-47daf3491118\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2fc7s" Apr 22 18:46:25.540579 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:25.539291 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:46:25.540579 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:25.539359 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert podName:a06f033a-b65b-48a1-bac1-47daf3491118 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:26.039340159 +0000 UTC m=+38.137690083 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2fc7s" (UID: "a06f033a-b65b-48a1-bac1-47daf3491118") : secret "networking-console-plugin-cert" not found Apr 22 18:46:25.540579 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:25.540333 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a06f033a-b65b-48a1-bac1-47daf3491118-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-2fc7s\" (UID: \"a06f033a-b65b-48a1-bac1-47daf3491118\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2fc7s" Apr 22 18:46:25.792788 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:25.791703 2571 generic.go:358] "Generic (PLEG): container finished" podID="22922279-9d57-4b39-9e9b-25a133f37c1b" containerID="a79403a09dd7d367975b036cc2e9557b7b991f63f3a707eca69d61103598971f" exitCode=0 Apr 22 18:46:25.792788 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:25.791758 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brbjp" event={"ID":"22922279-9d57-4b39-9e9b-25a133f37c1b","Type":"ContainerDied","Data":"a79403a09dd7d367975b036cc2e9557b7b991f63f3a707eca69d61103598971f"} Apr 22 18:46:26.043323 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:26.043234 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2fc7s\" (UID: \"a06f033a-b65b-48a1-bac1-47daf3491118\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2fc7s" Apr 22 18:46:26.043493 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:26.043422 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:46:26.043493 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:26.043492 2571 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert podName:a06f033a-b65b-48a1-bac1-47daf3491118 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:27.043471402 +0000 UTC m=+39.141821325 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2fc7s" (UID: "a06f033a-b65b-48a1-bac1-47daf3491118") : secret "networking-console-plugin-cert" not found Apr 22 18:46:27.056801 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:27.056147 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2fc7s\" (UID: \"a06f033a-b65b-48a1-bac1-47daf3491118\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2fc7s" Apr 22 18:46:27.056801 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:27.056323 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:46:27.056801 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:27.056403 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert podName:a06f033a-b65b-48a1-bac1-47daf3491118 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:29.056383644 +0000 UTC m=+41.154733564 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2fc7s" (UID: "a06f033a-b65b-48a1-bac1-47daf3491118") : secret "networking-console-plugin-cert" not found Apr 22 18:46:28.674473 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:28.674433 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:28.674934 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:28.674486 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9jp77\" (UID: \"d5b60b0a-0c13-4d95-83e6-b6d336565a6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77" Apr 22 18:46:28.674934 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:28.674606 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:46:28.674934 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:28.674629 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:46:28.674934 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:28.674702 2571 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls podName:d5b60b0a-0c13-4d95-83e6-b6d336565a6b nodeName:}" failed. No retries permitted until 2026-04-22 18:46:36.674681855 +0000 UTC m=+48.773031779 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9jp77" (UID: "d5b60b0a-0c13-4d95-83e6-b6d336565a6b") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:46:28.674934 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:28.674629 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d6f96f686-fwq9s: secret "image-registry-tls" not found Apr 22 18:46:28.674934 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:28.674767 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls podName:a07f3c68-23f1-4479-83d7-6e71fb29694e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:36.674756795 +0000 UTC m=+48.773106717 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls") pod "image-registry-7d6f96f686-fwq9s" (UID: "a07f3c68-23f1-4479-83d7-6e71fb29694e") : secret "image-registry-tls" not found Apr 22 18:46:28.775736 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:28.775659 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:28.775736 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:28.775722 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:28.775965 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:28.775763 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fnw4t\" (UID: \"e8cf109c-0a09-449e-87f3-54ad5a412455\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t" Apr 22 18:46:28.775965 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:28.775847 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle podName:0e27f58c-14be-45e2-8336-0577f514ae76 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:36.775823148 +0000 UTC m=+48.874173065 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle") pod "router-default-56d9858896-wq4xk" (UID: "0e27f58c-14be-45e2-8336-0577f514ae76") : configmap references non-existent config key: service-ca.crt Apr 22 18:46:28.775965 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:28.775883 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:46:28.775965 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:28.775944 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls podName:e8cf109c-0a09-449e-87f3-54ad5a412455 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:36.775928474 +0000 UTC m=+48.874278392 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fnw4t" (UID: "e8cf109c-0a09-449e-87f3-54ad5a412455") : secret "samples-operator-tls" not found Apr 22 18:46:28.775965 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:28.775883 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:46:28.776230 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:28.775990 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs podName:0e27f58c-14be-45e2-8336-0577f514ae76 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:36.775975403 +0000 UTC m=+48.874325321 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs") pod "router-default-56d9858896-wq4xk" (UID: "0e27f58c-14be-45e2-8336-0577f514ae76") : secret "router-metrics-certs-default" not found Apr 22 18:46:28.877289 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:28.877255 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls\") pod \"dns-default-lb4jp\" (UID: \"f9d429a1-a93e-47db-bed8-196a5bb0f748\") " pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:28.877481 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:28.877420 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:46:28.877481 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:28.877440 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert\") pod \"ingress-canary-4njjm\" (UID: \"e23ae0a9-f762-4687-bfe0-c02d473142be\") " pod="openshift-ingress-canary/ingress-canary-4njjm" Apr 22 18:46:28.877607 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:28.877493 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls podName:f9d429a1-a93e-47db-bed8-196a5bb0f748 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:36.877472829 +0000 UTC m=+48.975822758 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls") pod "dns-default-lb4jp" (UID: "f9d429a1-a93e-47db-bed8-196a5bb0f748") : secret "dns-default-metrics-tls" not found Apr 22 18:46:28.877607 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:28.877587 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:46:28.877775 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:28.877701 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert podName:e23ae0a9-f762-4687-bfe0-c02d473142be nodeName:}" failed. No retries permitted until 2026-04-22 18:46:36.87762243 +0000 UTC m=+48.975972347 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert") pod "ingress-canary-4njjm" (UID: "e23ae0a9-f762-4687-bfe0-c02d473142be") : secret "canary-serving-cert" not found Apr 22 18:46:29.079080 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:29.078986 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2fc7s\" (UID: \"a06f033a-b65b-48a1-bac1-47daf3491118\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2fc7s" Apr 22 18:46:29.079234 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:29.079147 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:46:29.079234 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:29.079230 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert podName:a06f033a-b65b-48a1-bac1-47daf3491118 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:33.079209284 +0000 UTC m=+45.177559207 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2fc7s" (UID: "a06f033a-b65b-48a1-bac1-47daf3491118") : secret "networking-console-plugin-cert" not found Apr 22 18:46:33.113490 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:33.113390 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2fc7s\" (UID: \"a06f033a-b65b-48a1-bac1-47daf3491118\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2fc7s" Apr 22 18:46:33.113929 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:33.113573 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:46:33.113929 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:33.113664 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert podName:a06f033a-b65b-48a1-bac1-47daf3491118 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:46:41.113638928 +0000 UTC m=+53.211988850 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2fc7s" (UID: "a06f033a-b65b-48a1-bac1-47daf3491118") : secret "networking-console-plugin-cert" not found Apr 22 18:46:36.747861 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.747819 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:36.748419 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.747877 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9jp77\" (UID: \"d5b60b0a-0c13-4d95-83e6-b6d336565a6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77" Apr 22 18:46:36.748419 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:36.747981 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:46:36.748419 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:36.748005 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d6f96f686-fwq9s: secret "image-registry-tls" not found Apr 22 18:46:36.748419 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:36.748043 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:46:36.748419 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:36.748073 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls podName:a07f3c68-23f1-4479-83d7-6e71fb29694e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:52.74805126 +0000 UTC m=+64.846401180 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls") pod "image-registry-7d6f96f686-fwq9s" (UID: "a07f3c68-23f1-4479-83d7-6e71fb29694e") : secret "image-registry-tls" not found Apr 22 18:46:36.748419 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:36.748107 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls podName:d5b60b0a-0c13-4d95-83e6-b6d336565a6b nodeName:}" failed. No retries permitted until 2026-04-22 18:46:52.748090107 +0000 UTC m=+64.846440037 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9jp77" (UID: "d5b60b0a-0c13-4d95-83e6-b6d336565a6b") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:46:36.821164 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.821126 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" event={"ID":"714499d9-f67c-41fd-bb2a-dcdf1721b2b9","Type":"ContainerStarted","Data":"3e11d5b9173ecfad3bf13b14fbbdfbf4c6608c1e73dda6f3caf1462f5c6da80e"} Apr 22 18:46:36.822794 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.822770 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5h829_12bf8f63-f32e-40cf-b529-9b5c1f6a9053/console-operator/0.log" Apr 22 18:46:36.822933 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.822813 2571 generic.go:358] "Generic (PLEG): container finished" podID="12bf8f63-f32e-40cf-b529-9b5c1f6a9053" containerID="588e75ded4c0e86196f3f131f163952bd445db7be17d4d31eb155cf34acc2dd2" exitCode=255 Apr 22 18:46:36.822933 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.822887 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5h829" event={"ID":"12bf8f63-f32e-40cf-b529-9b5c1f6a9053","Type":"ContainerDied","Data":"588e75ded4c0e86196f3f131f163952bd445db7be17d4d31eb155cf34acc2dd2"} Apr 22 18:46:36.823178 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.823158 2571 scope.go:117] "RemoveContainer" containerID="588e75ded4c0e86196f3f131f163952bd445db7be17d4d31eb155cf34acc2dd2" Apr 22 18:46:36.829552 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.829511 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brbjp" event={"ID":"22922279-9d57-4b39-9e9b-25a133f37c1b","Type":"ContainerStarted","Data":"8a01ce9a6d1b1bdb4e729690bb6dae3485ed1d0deb933b3deb93adf6a090bb7c"} Apr 22 18:46:36.831016 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.830986 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fdhpc" event={"ID":"a1739ddc-1770-4db7-a55f-9bc8b4cf2c65","Type":"ContainerStarted","Data":"ed536f311dae8bb26c7816efacb8c804ade550b563e1293dd78461bd9f12dec5"} Apr 22 18:46:36.832263 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.832241 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cf8c4bbff-b96bl" event={"ID":"833c11fe-7743-42be-ba1f-d3d7dbb6678c","Type":"ContainerStarted","Data":"04cce4aeaa153a7c9979e027877ea11a47bcac2f84c5d546de6ac1ff153ad019"} Apr 22 18:46:36.832475 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.832445 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cf8c4bbff-b96bl" Apr 22 18:46:36.833582 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.833559 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vqvdc" event={"ID":"38e2749f-84dd-40f0-b864-c5aaddc913a8","Type":"ContainerStarted","Data":"880f0e3cb7e6e882ebca9dbe427ae265e252894fa3f91d33a61dd2da4c2f8430"} Apr 22 18:46:36.834520 ip-10-0-133-42 kubenswrapper[2571]: 
I0422 18:46:36.834501 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cf8c4bbff-b96bl" Apr 22 18:46:36.835441 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.835416 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-tnfzj" event={"ID":"80e69f6d-8332-4923-a03d-15387087dc5a","Type":"ContainerStarted","Data":"aededa3255565b070f1befd7a503d0a7a5b427cb740d6bcc54d857cd1c88ce06"} Apr 22 18:46:36.836693 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.836658 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bvpqn" event={"ID":"fb5bea5b-4447-44e8-8573-662eda69835e","Type":"ContainerStarted","Data":"66fe13fab445a20e0ccb0b5e0e59a1ea943d553cc8d9d0ff1ca2f3489b6c32c3"} Apr 22 18:46:36.836793 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.836772 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:46:36.838013 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.837984 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8686b48ddc-cx2n5" event={"ID":"69e4cebf-e0c9-4dbf-9863-ed0cc7b49669","Type":"ContainerStarted","Data":"d27447eb56395b242dbf098051e8837d81ccb0f10d3ec4ad497f763dc9e404a3"} Apr 22 18:46:36.839450 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.839315 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kxklx" event={"ID":"a2f5711c-2c58-47dd-a89b-ab792485adbf","Type":"ContainerStarted","Data":"3e489049b9304d4deca47500046c2aebb33daeafc359fe10f54f03b254ba3d6c"} Apr 22 18:46:36.843053 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.843016 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwsv8" event={"ID":"166524a6-4f60-49b8-9020-7bae0d51168c","Type":"ContainerStarted","Data":"0ce205a5b238b83db67e2bfbc7b90736eafda039816dd499cd3e75dc35be6274"} Apr 22 18:46:36.848808 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.848786 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:36.848916 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.848847 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fnw4t\" (UID: \"e8cf109c-0a09-449e-87f3-54ad5a412455\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t" Apr 22 18:46:36.849004 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.848988 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" 
Apr 22 18:46:36.849120 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:36.849101 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle podName:0e27f58c-14be-45e2-8336-0577f514ae76 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:52.849083048 +0000 UTC m=+64.947432987 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle") pod "router-default-56d9858896-wq4xk" (UID: "0e27f58c-14be-45e2-8336-0577f514ae76") : configmap references non-existent config key: service-ca.crt Apr 22 18:46:36.849120 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:36.849102 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:46:36.849263 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:36.849150 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:46:36.849263 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:36.849153 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls podName:e8cf109c-0a09-449e-87f3-54ad5a412455 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:52.849141875 +0000 UTC m=+64.947491801 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fnw4t" (UID: "e8cf109c-0a09-449e-87f3-54ad5a412455") : secret "samples-operator-tls" not found Apr 22 18:46:36.849263 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:36.849212 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs podName:0e27f58c-14be-45e2-8336-0577f514ae76 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:52.849196643 +0000 UTC m=+64.947546571 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs") pod "router-default-56d9858896-wq4xk" (UID: "0e27f58c-14be-45e2-8336-0577f514ae76") : secret "router-metrics-certs-default" not found Apr 22 18:46:36.861387 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.861345 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8686b48ddc-cx2n5" podStartSLOduration=1.620457494 podStartE2EDuration="13.861329849s" podCreationTimestamp="2026-04-22 18:46:23 +0000 UTC" firstStartedPulling="2026-04-22 18:46:24.028032542 +0000 UTC m=+36.126382461" lastFinishedPulling="2026-04-22 18:46:36.268904883 +0000 UTC m=+48.367254816" observedRunningTime="2026-04-22 18:46:36.859580923 +0000 UTC m=+48.957930867" watchObservedRunningTime="2026-04-22 18:46:36.861329849 +0000 UTC m=+48.959679793" Apr 22 18:46:36.879295 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.879256 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-tnfzj" podStartSLOduration=17.239708167 podStartE2EDuration="29.879244527s" podCreationTimestamp="2026-04-22 18:46:07 +0000 UTC" firstStartedPulling="2026-04-22 18:46:23.566164755 +0000 UTC m=+35.664514688" lastFinishedPulling="2026-04-22 18:46:36.205701125 +0000 UTC m=+48.304051048" observedRunningTime="2026-04-22 18:46:36.878162885 +0000 UTC m=+48.976512829" watchObservedRunningTime="2026-04-22 18:46:36.879244527 +0000 UTC m=+48.977594511" Apr 22 18:46:36.895317 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.895264 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kxklx" podStartSLOduration=17.393012168 podStartE2EDuration="29.895247848s" podCreationTimestamp="2026-04-22 18:46:07 +0000 UTC" firstStartedPulling="2026-04-22 18:46:23.703502661 +0000 UTC m=+35.801852611" lastFinishedPulling="2026-04-22 18:46:36.205738373 +0000 UTC m=+48.304088291" observedRunningTime="2026-04-22 18:46:36.893335944 +0000 UTC m=+48.991685884" watchObservedRunningTime="2026-04-22 18:46:36.895247848 +0000 UTC m=+48.993597789" Apr 22 18:46:36.912655 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.912597 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bvpqn" podStartSLOduration=36.33692765 podStartE2EDuration="48.912580296s" podCreationTimestamp="2026-04-22 18:45:48 +0000 UTC" firstStartedPulling="2026-04-22 18:46:23.70218162 +0000 UTC m=+35.800531537" lastFinishedPulling="2026-04-22 18:46:36.277834249 +0000 UTC m=+48.376184183" observedRunningTime="2026-04-22 18:46:36.911744198 +0000 UTC m=+49.010094152" watchObservedRunningTime="2026-04-22 18:46:36.912580296 +0000 UTC m=+49.010930236" Apr 22 18:46:36.943426 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.942691 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-brbjp" podStartSLOduration=14.755044693 podStartE2EDuration="48.942672509s" podCreationTimestamp="2026-04-22 18:45:48 +0000 UTC" firstStartedPulling="2026-04-22 18:45:49.667172061 +0000 UTC m=+1.765521979" lastFinishedPulling="2026-04-22 18:46:23.854799864 +0000 UTC m=+35.953149795" observedRunningTime="2026-04-22 18:46:36.940707777 +0000 UTC m=+49.039057718" watchObservedRunningTime="2026-04-22 
18:46:36.942672509 +0000 UTC m=+49.041022451" Apr 22 18:46:36.950338 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.949865 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls\") pod \"dns-default-lb4jp\" (UID: \"f9d429a1-a93e-47db-bed8-196a5bb0f748\") " pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:36.950503 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.950341 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert\") pod \"ingress-canary-4njjm\" (UID: \"e23ae0a9-f762-4687-bfe0-c02d473142be\") " pod="openshift-ingress-canary/ingress-canary-4njjm" Apr 22 18:46:36.952590 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:36.951930 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:46:36.952590 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:36.951987 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls podName:f9d429a1-a93e-47db-bed8-196a5bb0f748 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:52.951970101 +0000 UTC m=+65.050320024 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls") pod "dns-default-lb4jp" (UID: "f9d429a1-a93e-47db-bed8-196a5bb0f748") : secret "dns-default-metrics-tls" not found Apr 22 18:46:36.953068 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:36.953045 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:46:36.953173 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:36.953100 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert podName:e23ae0a9-f762-4687-bfe0-c02d473142be nodeName:}" failed. No retries permitted until 2026-04-22 18:46:52.953084995 +0000 UTC m=+65.051434926 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert") pod "ingress-canary-4njjm" (UID: "e23ae0a9-f762-4687-bfe0-c02d473142be") : secret "canary-serving-cert" not found Apr 22 18:46:36.966175 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.961444 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vqvdc" podStartSLOduration=17.263905035 podStartE2EDuration="29.961421551s" podCreationTimestamp="2026-04-22 18:46:07 +0000 UTC" firstStartedPulling="2026-04-22 18:46:23.571528782 +0000 UTC m=+35.669878700" lastFinishedPulling="2026-04-22 18:46:36.269045284 +0000 UTC m=+48.367395216" observedRunningTime="2026-04-22 18:46:36.961339711 +0000 UTC m=+49.059689653" watchObservedRunningTime="2026-04-22 18:46:36.961421551 +0000 UTC m=+49.059771499" Apr 22 18:46:36.980621 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.980580 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fdhpc" podStartSLOduration=18.572027657 podStartE2EDuration="29.980565972s" podCreationTimestamp="2026-04-22 18:46:07 +0000 UTC" firstStartedPulling="2026-04-22 18:46:23.571254386 +0000 UTC m=+35.669604303" lastFinishedPulling="2026-04-22 18:46:34.979792698 +0000 UTC m=+47.078142618" observedRunningTime="2026-04-22 18:46:36.979282206 +0000 UTC m=+49.077632148" watchObservedRunningTime="2026-04-22 18:46:36.980565972 +0000 UTC m=+49.078915922" Apr 22 18:46:36.997245 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:36.997181 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cf8c4bbff-b96bl" podStartSLOduration=1.758331986 podStartE2EDuration="13.997163916s" podCreationTimestamp="2026-04-22 18:46:23 +0000 UTC" firstStartedPulling="2026-04-22 18:46:24.028643121 +0000 UTC m=+36.126993054" lastFinishedPulling="2026-04-22 18:46:36.267475051 +0000 UTC m=+48.365824984" observedRunningTime="2026-04-22 18:46:36.995269877 +0000 UTC m=+49.093619818" watchObservedRunningTime="2026-04-22 18:46:36.997163916 +0000 UTC m=+49.095513858" Apr 22 18:46:37.353761 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:37.353671 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret\") pod \"global-pull-secret-syncer-fk9jc\" (UID: \"f793756b-ba29-4f1d-878b-9d4abe4d5ad3\") " pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:37.364585 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:37.364553 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f793756b-ba29-4f1d-878b-9d4abe4d5ad3-original-pull-secret\") pod \"global-pull-secret-syncer-fk9jc\" (UID: \"f793756b-ba29-4f1d-878b-9d4abe4d5ad3\") " pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:37.639163 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:37.639089 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fk9jc" Apr 22 18:46:37.759201 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:37.759152 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwsv8" podStartSLOduration=18.119122532 podStartE2EDuration="30.759135479s" podCreationTimestamp="2026-04-22 18:46:07 +0000 UTC" firstStartedPulling="2026-04-22 18:46:23.565618608 +0000 UTC m=+35.663968530" lastFinishedPulling="2026-04-22 18:46:36.205631545 +0000 UTC m=+48.303981477" observedRunningTime="2026-04-22 18:46:37.018338678 +0000 UTC m=+49.116688620" watchObservedRunningTime="2026-04-22 18:46:37.759135479 +0000 UTC m=+49.857485428" Apr 22 18:46:37.760336 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:37.760311 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fk9jc"] Apr 22 18:46:37.763083 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:46:37.763061 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf793756b_ba29_4f1d_878b_9d4abe4d5ad3.slice/crio-f6585b910646c323cf183a2fa14c4b2b079be983a779d7747345fe0fa2272595 WatchSource:0}: Error finding container f6585b910646c323cf183a2fa14c4b2b079be983a779d7747345fe0fa2272595: Status 404 returned error can't find the container with id f6585b910646c323cf183a2fa14c4b2b079be983a779d7747345fe0fa2272595 Apr 22 18:46:37.849108 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:37.849077 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fk9jc" event={"ID":"f793756b-ba29-4f1d-878b-9d4abe4d5ad3","Type":"ContainerStarted","Data":"f6585b910646c323cf183a2fa14c4b2b079be983a779d7747345fe0fa2272595"} Apr 22 18:46:37.850827 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:37.850776 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5h829_12bf8f63-f32e-40cf-b529-9b5c1f6a9053/console-operator/1.log" Apr 22 18:46:37.851300 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:37.851274 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5h829_12bf8f63-f32e-40cf-b529-9b5c1f6a9053/console-operator/0.log" Apr 22 18:46:37.851418 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:37.851310 2571 generic.go:358] "Generic (PLEG): container finished" podID="12bf8f63-f32e-40cf-b529-9b5c1f6a9053" containerID="b586a58f42789c8b58998e92f05a9600b1e6cac0d3cf3bd9bfe1fcc15151a8ed" exitCode=255 Apr 22 18:46:37.851418 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:37.851377 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5h829" event={"ID":"12bf8f63-f32e-40cf-b529-9b5c1f6a9053","Type":"ContainerDied","Data":"b586a58f42789c8b58998e92f05a9600b1e6cac0d3cf3bd9bfe1fcc15151a8ed"} Apr 22 18:46:37.851552 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:37.851438 2571 scope.go:117] "RemoveContainer" containerID="588e75ded4c0e86196f3f131f163952bd445db7be17d4d31eb155cf34acc2dd2" Apr 22 18:46:37.851882 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:37.851863 2571 scope.go:117] "RemoveContainer" containerID="b586a58f42789c8b58998e92f05a9600b1e6cac0d3cf3bd9bfe1fcc15151a8ed" Apr 22 18:46:37.852073 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:37.852052 2571 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-5h829_openshift-console-operator(12bf8f63-f32e-40cf-b529-9b5c1f6a9053)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5h829" podUID="12bf8f63-f32e-40cf-b529-9b5c1f6a9053" Apr 22 18:46:37.889869 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:37.889805 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-txhb9"] Apr 22 18:46:37.911096 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:37.911057 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-txhb9"] Apr 22 18:46:37.911257 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:37.911228 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-txhb9" Apr 22 18:46:37.914047 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:37.914025 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 18:46:37.914206 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:37.914152 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-2vvlf\"" Apr 22 18:46:37.914333 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:37.914168 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:37.960742 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:37.960706 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kntww\" (UniqueName: \"kubernetes.io/projected/2a94e10b-e40c-4bfa-99b0-a02b6e36263e-kube-api-access-kntww\") pod \"migrator-74bb7799d9-txhb9\" (UID: \"2a94e10b-e40c-4bfa-99b0-a02b6e36263e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-txhb9" Apr 22 18:46:38.062123 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:38.061983 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kntww\" (UniqueName: \"kubernetes.io/projected/2a94e10b-e40c-4bfa-99b0-a02b6e36263e-kube-api-access-kntww\") pod \"migrator-74bb7799d9-txhb9\" (UID: \"2a94e10b-e40c-4bfa-99b0-a02b6e36263e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-txhb9" Apr 22 18:46:38.070214 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:38.070187 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kntww\" (UniqueName: \"kubernetes.io/projected/2a94e10b-e40c-4bfa-99b0-a02b6e36263e-kube-api-access-kntww\") pod \"migrator-74bb7799d9-txhb9\" (UID: \"2a94e10b-e40c-4bfa-99b0-a02b6e36263e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-txhb9" Apr 22 18:46:38.222517 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:38.222483 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-txhb9" Apr 22 18:46:38.363467 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:38.363416 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-txhb9"] Apr 22 18:46:38.367109 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:46:38.367075 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a94e10b_e40c_4bfa_99b0_a02b6e36263e.slice/crio-4a8c205cde22d0c2b8668c25cb32f4216ca7a89cc8563bc997b7812e0366bba3 WatchSource:0}: Error finding container 4a8c205cde22d0c2b8668c25cb32f4216ca7a89cc8563bc997b7812e0366bba3: Status 404 returned error can't find the container with id 4a8c205cde22d0c2b8668c25cb32f4216ca7a89cc8563bc997b7812e0366bba3 Apr 22 18:46:38.856603 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:38.856578 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5h829_12bf8f63-f32e-40cf-b529-9b5c1f6a9053/console-operator/1.log" Apr 22 18:46:38.857014 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:38.856988 2571 scope.go:117] "RemoveContainer" containerID="b586a58f42789c8b58998e92f05a9600b1e6cac0d3cf3bd9bfe1fcc15151a8ed" Apr 22 18:46:38.857280 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:38.857228 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-5h829_openshift-console-operator(12bf8f63-f32e-40cf-b529-9b5c1f6a9053)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5h829" podUID="12bf8f63-f32e-40cf-b529-9b5c1f6a9053" Apr 22 18:46:38.858192 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:38.858162 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-txhb9" event={"ID":"2a94e10b-e40c-4bfa-99b0-a02b6e36263e","Type":"ContainerStarted","Data":"4a8c205cde22d0c2b8668c25cb32f4216ca7a89cc8563bc997b7812e0366bba3"} Apr 22 18:46:40.866422 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:40.866372 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" event={"ID":"714499d9-f67c-41fd-bb2a-dcdf1721b2b9","Type":"ContainerStarted","Data":"7eba6fd42b41030f8932bf229a4c3f43313db3c5d21db1a46e747965f0341915"} Apr 22 18:46:40.866422 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:40.866420 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" event={"ID":"714499d9-f67c-41fd-bb2a-dcdf1721b2b9","Type":"ContainerStarted","Data":"01765bd0fedeb1919a62bbb41aa403fb1543af0e400616e556a1349ddc7b5b75"} Apr 22 18:46:40.867949 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:40.867921 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-txhb9" event={"ID":"2a94e10b-e40c-4bfa-99b0-a02b6e36263e","Type":"ContainerStarted","Data":"213aa76990b0e865cf80b16560ab6cbce5d1a8992619f9559b5926b7c1cdf5c8"} Apr 22 18:46:40.868051 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:40.867957 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-txhb9" 
event={"ID":"2a94e10b-e40c-4bfa-99b0-a02b6e36263e","Type":"ContainerStarted","Data":"dc23edb2a4d110a27375ec0eaecf39a7a4fedc19648e8dbdeee23e0cc2081419"} Apr 22 18:46:40.899618 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:40.899580 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" podStartSLOduration=2.017234705 podStartE2EDuration="17.899567948s" podCreationTimestamp="2026-04-22 18:46:23 +0000 UTC" firstStartedPulling="2026-04-22 18:46:24.062655219 +0000 UTC m=+36.161005137" lastFinishedPulling="2026-04-22 18:46:39.94498845 +0000 UTC m=+52.043338380" observedRunningTime="2026-04-22 18:46:40.898932922 +0000 UTC m=+52.997282865" watchObservedRunningTime="2026-04-22 18:46:40.899567948 +0000 UTC m=+52.997917886" Apr 22 18:46:40.925098 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:40.925056 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-txhb9" podStartSLOduration=2.351855035 podStartE2EDuration="3.925044255s" podCreationTimestamp="2026-04-22 18:46:37 +0000 UTC" firstStartedPulling="2026-04-22 18:46:38.369165084 +0000 UTC m=+50.467515008" lastFinishedPulling="2026-04-22 18:46:39.942354298 +0000 UTC m=+52.040704228" observedRunningTime="2026-04-22 18:46:40.924120786 +0000 UTC m=+53.022470726" watchObservedRunningTime="2026-04-22 18:46:40.925044255 +0000 UTC m=+53.023394194" Apr 22 18:46:41.195977 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:41.195937 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2fc7s\" (UID: \"a06f033a-b65b-48a1-bac1-47daf3491118\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2fc7s" Apr 22 18:46:41.196165 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:41.196122 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:46:41.196228 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:41.196209 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert podName:a06f033a-b65b-48a1-bac1-47daf3491118 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:57.196190938 +0000 UTC m=+69.294540856 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2fc7s" (UID: "a06f033a-b65b-48a1-bac1-47daf3491118") : secret "networking-console-plugin-cert" not found Apr 22 18:46:41.299646 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:41.299616 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-5h829" Apr 22 18:46:41.300028 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:41.300006 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-5h829" Apr 22 18:46:41.300118 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:41.300062 2571 scope.go:117] "RemoveContainer" containerID="b586a58f42789c8b58998e92f05a9600b1e6cac0d3cf3bd9bfe1fcc15151a8ed" Apr 22 18:46:41.300264 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:41.300238 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-5h829_openshift-console-operator(12bf8f63-f32e-40cf-b529-9b5c1f6a9053)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5h829" podUID="12bf8f63-f32e-40cf-b529-9b5c1f6a9053" Apr 22 18:46:41.871037 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:41.870998 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qzzp6_0c75569b-ad2e-4296-8dad-807a8c913df1/dns-node-resolver/0.log" Apr 22 18:46:41.871509 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:41.871249 2571 scope.go:117] "RemoveContainer" containerID="b586a58f42789c8b58998e92f05a9600b1e6cac0d3cf3bd9bfe1fcc15151a8ed" Apr 22 18:46:41.871509 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:41.871475 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-5h829_openshift-console-operator(12bf8f63-f32e-40cf-b529-9b5c1f6a9053)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5h829" podUID="12bf8f63-f32e-40cf-b529-9b5c1f6a9053" Apr 22 18:46:42.875059 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:42.874965 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fk9jc" event={"ID":"f793756b-ba29-4f1d-878b-9d4abe4d5ad3","Type":"ContainerStarted","Data":"bd77e163dd1fcddad86002e8e0e94224972b4ede3330a54929869f6bdfaa275f"} Apr 22 18:46:42.894087 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:42.894037 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-fk9jc" podStartSLOduration=33.19320047 podStartE2EDuration="37.894023986s" podCreationTimestamp="2026-04-22 18:46:05 +0000 UTC" firstStartedPulling="2026-04-22 18:46:37.764829637 +0000 UTC m=+49.863179554" lastFinishedPulling="2026-04-22 18:46:42.465653149 +0000 UTC m=+54.564003070" observedRunningTime="2026-04-22 18:46:42.893651095 +0000 UTC m=+54.992001035" watchObservedRunningTime="2026-04-22 18:46:42.894023986 +0000 UTC m=+54.992373926" Apr 22 18:46:43.071433 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:43.071404 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_node-ca-rvx2v_e63b9430-ddb4-4626-8019-7bfa90ffac77/node-ca/0.log" Apr 22 18:46:43.872137 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:43.872110 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-txhb9_2a94e10b-e40c-4bfa-99b0-a02b6e36263e/migrator/0.log" Apr 22 18:46:44.073781 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:44.073753 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-txhb9_2a94e10b-e40c-4bfa-99b0-a02b6e36263e/graceful-termination/0.log" Apr 22 18:46:44.272559 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:44.272501 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qwsv8_166524a6-4f60-49b8-9020-7bae0d51168c/kube-storage-version-migrator-operator/0.log" Apr 22 18:46:46.690160 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:46.690126 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vmbf4" Apr 22 18:46:52.422952 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.422910 2571 scope.go:117] "RemoveContainer" containerID="b586a58f42789c8b58998e92f05a9600b1e6cac0d3cf3bd9bfe1fcc15151a8ed" Apr 22 18:46:52.803209 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.803174 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:52.803209 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.803216 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9jp77\" (UID: \"d5b60b0a-0c13-4d95-83e6-b6d336565a6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77" Apr 22 18:46:52.805604 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.805576 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5b60b0a-0c13-4d95-83e6-b6d336565a6b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9jp77\" (UID: \"d5b60b0a-0c13-4d95-83e6-b6d336565a6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77" Apr 22 18:46:52.805604 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.805595 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls\") pod \"image-registry-7d6f96f686-fwq9s\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") " pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:52.902664 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.902636 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5h829_12bf8f63-f32e-40cf-b529-9b5c1f6a9053/console-operator/2.log" Apr 22 18:46:52.902997 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.902980 2571 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5h829_12bf8f63-f32e-40cf-b529-9b5c1f6a9053/console-operator/1.log" Apr 22 18:46:52.903055 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.903016 2571 generic.go:358] "Generic (PLEG): container finished" podID="12bf8f63-f32e-40cf-b529-9b5c1f6a9053" containerID="2710cc48e4a6621bf230839f506bc1be34918986a37c9d03408cd39178e5c767" exitCode=255 Apr 22 18:46:52.903091 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.903080 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5h829" event={"ID":"12bf8f63-f32e-40cf-b529-9b5c1f6a9053","Type":"ContainerDied","Data":"2710cc48e4a6621bf230839f506bc1be34918986a37c9d03408cd39178e5c767"} Apr 22 18:46:52.903128 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.903118 2571 scope.go:117] "RemoveContainer" containerID="b586a58f42789c8b58998e92f05a9600b1e6cac0d3cf3bd9bfe1fcc15151a8ed" Apr 22 18:46:52.903468 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.903449 2571 scope.go:117] "RemoveContainer" containerID="2710cc48e4a6621bf230839f506bc1be34918986a37c9d03408cd39178e5c767" Apr 22 18:46:52.907625 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:46:52.904207 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-5h829_openshift-console-operator(12bf8f63-f32e-40cf-b529-9b5c1f6a9053)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5h829" podUID="12bf8f63-f32e-40cf-b529-9b5c1f6a9053" Apr 22 18:46:52.907625 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.904477 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:52.907625 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.904567 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fnw4t\" (UID: \"e8cf109c-0a09-449e-87f3-54ad5a412455\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t" Apr 22 18:46:52.907852 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.907829 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:52.908244 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.908210 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e27f58c-14be-45e2-8336-0577f514ae76-metrics-certs\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:52.908778 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.908695 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e27f58c-14be-45e2-8336-0577f514ae76-service-ca-bundle\") pod \"router-default-56d9858896-wq4xk\" (UID: \"0e27f58c-14be-45e2-8336-0577f514ae76\") " pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:52.910225 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.910203 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e8cf109c-0a09-449e-87f3-54ad5a412455-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fnw4t\" (UID: \"e8cf109c-0a09-449e-87f3-54ad5a412455\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t" Apr 22 18:46:52.957280 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.957258 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-mvb7n\"" Apr 22 18:46:52.965433 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.965415 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:52.972818 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.972800 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-lwgbm\"" Apr 22 18:46:52.980938 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:52.980919 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77" Apr 22 18:46:53.008973 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.008939 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls\") pod \"dns-default-lb4jp\" (UID: \"f9d429a1-a93e-47db-bed8-196a5bb0f748\") " pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:53.009151 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.009061 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert\") pod \"ingress-canary-4njjm\" (UID: \"e23ae0a9-f762-4687-bfe0-c02d473142be\") " pod="openshift-ingress-canary/ingress-canary-4njjm" Apr 22 18:46:53.012513 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.012465 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9d429a1-a93e-47db-bed8-196a5bb0f748-metrics-tls\") pod \"dns-default-lb4jp\" (UID: \"f9d429a1-a93e-47db-bed8-196a5bb0f748\") " pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:53.012655 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.012556 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e23ae0a9-f762-4687-bfe0-c02d473142be-cert\") pod \"ingress-canary-4njjm\" (UID: \"e23ae0a9-f762-4687-bfe0-c02d473142be\") " pod="openshift-ingress-canary/ingress-canary-4njjm" Apr 22 18:46:53.026779 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.026519 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-47626\"" Apr 22 18:46:53.035011 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.034906 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:53.081447 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.081414 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-cktvr\"" Apr 22 18:46:53.089229 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.089184 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t" Apr 22 18:46:53.096980 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.096953 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7d6f96f686-fwq9s"] Apr 22 18:46:53.098886 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:46:53.098848 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda07f3c68_23f1_4479_83d7_6e71fb29694e.slice/crio-da878ea5d4c3c06f81b677c17e431b8556773d88a9b8b173475b5cb00470d6ff WatchSource:0}: Error finding container da878ea5d4c3c06f81b677c17e431b8556773d88a9b8b173475b5cb00470d6ff: Status 404 returned error can't find the container with id da878ea5d4c3c06f81b677c17e431b8556773d88a9b8b173475b5cb00470d6ff Apr 22 18:46:53.116587 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.116554 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77"] Apr 22 18:46:53.119570 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.119524 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wnqbz\"" Apr 22 18:46:53.119857 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:46:53.119812 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5b60b0a_0c13_4d95_83e6_b6d336565a6b.slice/crio-81facd7dbe16e9217eb2ee2ecd872976c4048027000b32a4dad946ce3f5b68c4 WatchSource:0}: Error finding container 81facd7dbe16e9217eb2ee2ecd872976c4048027000b32a4dad946ce3f5b68c4: Status 404 returned error can't find the container with id 81facd7dbe16e9217eb2ee2ecd872976c4048027000b32a4dad946ce3f5b68c4 Apr 22 18:46:53.127793 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.127771 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:53.144795 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.144385 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-5j5kr\"" Apr 22 18:46:53.152953 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.150985 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4njjm" Apr 22 18:46:53.179007 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.178110 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-56d9858896-wq4xk"] Apr 22 18:46:53.185525 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:46:53.185474 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e27f58c_14be_45e2_8336_0577f514ae76.slice/crio-0872b0c10c235e6bfe8a11062d11ab3e6a5a950e30f380350e3727857630aaa9 WatchSource:0}: Error finding container 0872b0c10c235e6bfe8a11062d11ab3e6a5a950e30f380350e3727857630aaa9: Status 404 returned error can't find the container with id 0872b0c10c235e6bfe8a11062d11ab3e6a5a950e30f380350e3727857630aaa9 Apr 22 18:46:53.212136 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.211458 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs\") pod \"network-metrics-daemon-v6b2n\" (UID: \"e0d458b0-40cd-4eaf-8dbf-220566ae55ef\") " pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:46:53.216771 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.214791 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:46:53.230660 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.227762 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0d458b0-40cd-4eaf-8dbf-220566ae55ef-metrics-certs\") pod \"network-metrics-daemon-v6b2n\" (UID: \"e0d458b0-40cd-4eaf-8dbf-220566ae55ef\") " pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:46:53.246508 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.246477 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t"] Apr 22 18:46:53.256328 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.256021 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-scwsl\"" Apr 22 18:46:53.263786 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.263418 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v6b2n" Apr 22 18:46:53.295600 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.295572 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lb4jp"] Apr 22 18:46:53.298218 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:46:53.298136 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9d429a1_a93e_47db_bed8_196a5bb0f748.slice/crio-3961ae020ee3e9057448cc09b9dd7aa3d0336412ced9f9ff35ff5c4fa111c227 WatchSource:0}: Error finding container 3961ae020ee3e9057448cc09b9dd7aa3d0336412ced9f9ff35ff5c4fa111c227: Status 404 returned error can't find the container with id 3961ae020ee3e9057448cc09b9dd7aa3d0336412ced9f9ff35ff5c4fa111c227 Apr 22 18:46:53.314005 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.313976 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4njjm"] Apr 22 18:46:53.317300 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:46:53.317264 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode23ae0a9_f762_4687_bfe0_c02d473142be.slice/crio-ed407c3631a784225fbf97d6539c03dcee21beceebe6a348f478f3d09f958145 WatchSource:0}: Error finding container ed407c3631a784225fbf97d6539c03dcee21beceebe6a348f478f3d09f958145: Status 404 returned error can't find the container with id ed407c3631a784225fbf97d6539c03dcee21beceebe6a348f478f3d09f958145 Apr 22 18:46:53.399246 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.399197 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-v6b2n"] Apr 22 18:46:53.402081 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:46:53.402053 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0d458b0_40cd_4eaf_8dbf_220566ae55ef.slice/crio-7cf99ab162501bca14f2535fcb37c3780bc0e3fb5398d979b23414f82ebb1624 WatchSource:0}: Error finding container 7cf99ab162501bca14f2535fcb37c3780bc0e3fb5398d979b23414f82ebb1624: Status 404 returned error can't find the container with id 7cf99ab162501bca14f2535fcb37c3780bc0e3fb5398d979b23414f82ebb1624 Apr 22 18:46:53.910394 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.910323 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v6b2n" event={"ID":"e0d458b0-40cd-4eaf-8dbf-220566ae55ef","Type":"ContainerStarted","Data":"7cf99ab162501bca14f2535fcb37c3780bc0e3fb5398d979b23414f82ebb1624"} Apr 22 18:46:53.911770 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.911710 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t" event={"ID":"e8cf109c-0a09-449e-87f3-54ad5a412455","Type":"ContainerStarted","Data":"a1003914fa68065bdf34cbfcac9efe4e764190125d1a7e8030164a316b9b7b37"} Apr 22 18:46:53.913845 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.913819 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4njjm" event={"ID":"e23ae0a9-f762-4687-bfe0-c02d473142be","Type":"ContainerStarted","Data":"ed407c3631a784225fbf97d6539c03dcee21beceebe6a348f478f3d09f958145"} Apr 22 18:46:53.916467 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.916442 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lb4jp" 
event={"ID":"f9d429a1-a93e-47db-bed8-196a5bb0f748","Type":"ContainerStarted","Data":"3961ae020ee3e9057448cc09b9dd7aa3d0336412ced9f9ff35ff5c4fa111c227"} Apr 22 18:46:53.918207 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.918170 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" event={"ID":"a07f3c68-23f1-4479-83d7-6e71fb29694e","Type":"ContainerStarted","Data":"7ae1ebc56c5414454b678784001641a06d04ef50a7657dcd6b128c91dc94a54e"} Apr 22 18:46:53.918207 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.918200 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" event={"ID":"a07f3c68-23f1-4479-83d7-6e71fb29694e","Type":"ContainerStarted","Data":"da878ea5d4c3c06f81b677c17e431b8556773d88a9b8b173475b5cb00470d6ff"} Apr 22 18:46:53.918614 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.918598 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:46:53.920606 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.920577 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-56d9858896-wq4xk" event={"ID":"0e27f58c-14be-45e2-8336-0577f514ae76","Type":"ContainerStarted","Data":"598c9087b862a103ff9e22f46cd8d10e34634348188204faa69b1e954a6a0ffc"} Apr 22 18:46:53.920606 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.920607 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-56d9858896-wq4xk" event={"ID":"0e27f58c-14be-45e2-8336-0577f514ae76","Type":"ContainerStarted","Data":"0872b0c10c235e6bfe8a11062d11ab3e6a5a950e30f380350e3727857630aaa9"} Apr 22 18:46:53.922765 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.922626 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5h829_12bf8f63-f32e-40cf-b529-9b5c1f6a9053/console-operator/2.log" Apr 22 18:46:53.925216 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.925122 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77" event={"ID":"d5b60b0a-0c13-4d95-83e6-b6d336565a6b","Type":"ContainerStarted","Data":"81facd7dbe16e9217eb2ee2ecd872976c4048027000b32a4dad946ce3f5b68c4"} Apr 22 18:46:53.944296 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.944240 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" podStartSLOduration=65.944221061 podStartE2EDuration="1m5.944221061s" podCreationTimestamp="2026-04-22 18:45:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:53.942853519 +0000 UTC m=+66.041203460" watchObservedRunningTime="2026-04-22 18:46:53.944221061 +0000 UTC m=+66.042571005" Apr 22 18:46:53.963703 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:53.963309 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-56d9858896-wq4xk" podStartSLOduration=46.963291117 podStartE2EDuration="46.963291117s" podCreationTimestamp="2026-04-22 18:46:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:53.96169164 +0000 UTC m=+66.060041588" watchObservedRunningTime="2026-04-22 
18:46:53.963291117 +0000 UTC m=+66.061641052" Apr 22 18:46:54.036067 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:54.036033 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:54.039231 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:54.039208 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:54.928567 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:54.928501 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:54.930217 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:54.930195 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-56d9858896-wq4xk" Apr 22 18:46:57.251442 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:57.251392 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2fc7s\" (UID: \"a06f033a-b65b-48a1-bac1-47daf3491118\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2fc7s" Apr 22 18:46:57.253977 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:57.253954 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a06f033a-b65b-48a1-bac1-47daf3491118-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2fc7s\" (UID: \"a06f033a-b65b-48a1-bac1-47daf3491118\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2fc7s" Apr 22 18:46:57.525604 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:57.525516 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-fszwb\"" Apr 22 18:46:57.533059 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:57.533036 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-2fc7s" Apr 22 18:46:58.260210 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.259830 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-2fc7s"] Apr 22 18:46:58.735067 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.735039 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-42q2k"] Apr 22 18:46:58.750295 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.750268 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-42q2k" Apr 22 18:46:58.753650 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.753627 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:46:58.754020 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.753979 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-gf99c\"" Apr 22 18:46:58.754226 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.754162 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:46:58.755702 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.755643 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-42q2k"] Apr 22 18:46:58.763418 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.763396 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2-data-volume\") pod \"insights-runtime-extractor-42q2k\" (UID: \"a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2\") " pod="openshift-insights/insights-runtime-extractor-42q2k" Apr 22 18:46:58.763564 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.763433 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-42q2k\" (UID: \"a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2\") " pod="openshift-insights/insights-runtime-extractor-42q2k" Apr 22 18:46:58.763564 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.763456 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-42q2k\" (UID: \"a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2\") " pod="openshift-insights/insights-runtime-extractor-42q2k" Apr 22 18:46:58.763564 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.763527 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z85d7\" (UniqueName: \"kubernetes.io/projected/a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2-kube-api-access-z85d7\") pod \"insights-runtime-extractor-42q2k\" (UID: \"a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2\") " pod="openshift-insights/insights-runtime-extractor-42q2k" Apr 22 18:46:58.763792 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.763765 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2-crio-socket\") pod \"insights-runtime-extractor-42q2k\" (UID: \"a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2\") " pod="openshift-insights/insights-runtime-extractor-42q2k" Apr 22 18:46:58.864969 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.864927 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-42q2k\" (UID: 
\"a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2\") " pod="openshift-insights/insights-runtime-extractor-42q2k" Apr 22 18:46:58.865149 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.864983 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-42q2k\" (UID: \"a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2\") " pod="openshift-insights/insights-runtime-extractor-42q2k" Apr 22 18:46:58.865149 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.865111 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z85d7\" (UniqueName: \"kubernetes.io/projected/a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2-kube-api-access-z85d7\") pod \"insights-runtime-extractor-42q2k\" (UID: \"a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2\") " pod="openshift-insights/insights-runtime-extractor-42q2k" Apr 22 18:46:58.865262 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.865144 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2-crio-socket\") pod \"insights-runtime-extractor-42q2k\" (UID: \"a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2\") " pod="openshift-insights/insights-runtime-extractor-42q2k" Apr 22 18:46:58.865313 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.865286 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2-data-volume\") pod \"insights-runtime-extractor-42q2k\" (UID: \"a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2\") " pod="openshift-insights/insights-runtime-extractor-42q2k" Apr 22 18:46:58.865363 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.865329 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2-crio-socket\") pod \"insights-runtime-extractor-42q2k\" (UID: \"a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2\") " pod="openshift-insights/insights-runtime-extractor-42q2k" Apr 22 18:46:58.865602 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.865576 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2-data-volume\") pod \"insights-runtime-extractor-42q2k\" (UID: \"a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2\") " pod="openshift-insights/insights-runtime-extractor-42q2k" Apr 22 18:46:58.865731 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.865642 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-42q2k\" (UID: \"a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2\") " pod="openshift-insights/insights-runtime-extractor-42q2k" Apr 22 18:46:58.867784 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.867761 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-42q2k\" (UID: \"a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2\") " pod="openshift-insights/insights-runtime-extractor-42q2k" Apr 22 18:46:58.874964 ip-10-0-133-42 
kubenswrapper[2571]: I0422 18:46:58.874916 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z85d7\" (UniqueName: \"kubernetes.io/projected/a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2-kube-api-access-z85d7\") pod \"insights-runtime-extractor-42q2k\" (UID: \"a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2\") " pod="openshift-insights/insights-runtime-extractor-42q2k" Apr 22 18:46:58.941968 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.941929 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-2fc7s" event={"ID":"a06f033a-b65b-48a1-bac1-47daf3491118","Type":"ContainerStarted","Data":"11c34a68d3f9bdf91e208df6b2ede08cb8ca7e181233e1c70b3e29c9ca964047"} Apr 22 18:46:58.943189 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.943165 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77" event={"ID":"d5b60b0a-0c13-4d95-83e6-b6d336565a6b","Type":"ContainerStarted","Data":"fa7300b096fb99dd2bc018aca11e7dcb77b93c564aaf7bdf2c3280f65e141f81"} Apr 22 18:46:58.944722 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.944674 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v6b2n" event={"ID":"e0d458b0-40cd-4eaf-8dbf-220566ae55ef","Type":"ContainerStarted","Data":"b3531f84b68c563c7547e98f9631793875e241f6588e92ace4de1aefc2db0281"} Apr 22 18:46:58.944722 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.944702 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v6b2n" event={"ID":"e0d458b0-40cd-4eaf-8dbf-220566ae55ef","Type":"ContainerStarted","Data":"91e7e03e0c654436c43060b33ab0e7f6a1a9209e35c36d6b59d62575a7c8fcb6"} Apr 22 18:46:58.946098 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.946074 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t" event={"ID":"e8cf109c-0a09-449e-87f3-54ad5a412455","Type":"ContainerStarted","Data":"3c5cde1d77e89066b53444ee7f91051dd017b7642dd8aaaa667ba3bfd2a66a3d"} Apr 22 18:46:58.946174 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.946104 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t" event={"ID":"e8cf109c-0a09-449e-87f3-54ad5a412455","Type":"ContainerStarted","Data":"9b8767d817d0311b88c59ba6ca9b427e029f11cb5a3ebfff15dd5ecc7206be7c"} Apr 22 18:46:58.947117 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.947099 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4njjm" event={"ID":"e23ae0a9-f762-4687-bfe0-c02d473142be","Type":"ContainerStarted","Data":"82e5b28b376a79bc8b103429ea920c3b14f81769c4cfad109174e072a8df5919"} Apr 22 18:46:58.948582 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.948561 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lb4jp" event={"ID":"f9d429a1-a93e-47db-bed8-196a5bb0f748","Type":"ContainerStarted","Data":"394408a097723232bef26f71d519e7ccf95c23caa093e44dd47dbe09e932b743"} Apr 22 18:46:58.948671 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.948587 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lb4jp" event={"ID":"f9d429a1-a93e-47db-bed8-196a5bb0f748","Type":"ContainerStarted","Data":"d201bfdbf88a3f57e47d1863f7b28fac1f66f279a7f247d69c1f5bbedc0e0c3a"} Apr 22 18:46:58.948727 
ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.948715 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lb4jp" Apr 22 18:46:58.969685 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.969638 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9jp77" podStartSLOduration=47.176962088 podStartE2EDuration="51.969622628s" podCreationTimestamp="2026-04-22 18:46:07 +0000 UTC" firstStartedPulling="2026-04-22 18:46:53.121760596 +0000 UTC m=+65.220110513" lastFinishedPulling="2026-04-22 18:46:57.914421131 +0000 UTC m=+70.012771053" observedRunningTime="2026-04-22 18:46:58.967722807 +0000 UTC m=+71.066072750" watchObservedRunningTime="2026-04-22 18:46:58.969622628 +0000 UTC m=+71.067972569" Apr 22 18:46:58.987175 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:58.987094 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fnw4t" podStartSLOduration=47.360246548 podStartE2EDuration="51.987081973s" podCreationTimestamp="2026-04-22 18:46:07 +0000 UTC" firstStartedPulling="2026-04-22 18:46:53.289456514 +0000 UTC m=+65.387806433" lastFinishedPulling="2026-04-22 18:46:57.916291925 +0000 UTC m=+70.014641858" observedRunningTime="2026-04-22 18:46:58.986653221 +0000 UTC m=+71.085003163" watchObservedRunningTime="2026-04-22 18:46:58.987081973 +0000 UTC m=+71.085431961" Apr 22 18:46:59.006583 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:59.006521 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4njjm" podStartSLOduration=34.411706145 podStartE2EDuration="39.006504261s" podCreationTimestamp="2026-04-22 18:46:20 +0000 UTC" firstStartedPulling="2026-04-22 18:46:53.319704851 +0000 UTC m=+65.418054769" lastFinishedPulling="2026-04-22 18:46:57.914502957 +0000 UTC m=+70.012852885" observedRunningTime="2026-04-22 18:46:59.005818933 +0000 UTC m=+71.104168909" watchObservedRunningTime="2026-04-22 18:46:59.006504261 +0000 UTC m=+71.104854201" Apr 22 18:46:59.022023 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:59.021966 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-v6b2n" podStartSLOduration=66.51203804 podStartE2EDuration="1m11.021952792s" podCreationTimestamp="2026-04-22 18:45:48 +0000 UTC" firstStartedPulling="2026-04-22 18:46:53.404507037 +0000 UTC m=+65.502856961" lastFinishedPulling="2026-04-22 18:46:57.914421791 +0000 UTC m=+70.012771713" observedRunningTime="2026-04-22 18:46:59.021500024 +0000 UTC m=+71.119849965" watchObservedRunningTime="2026-04-22 18:46:59.021952792 +0000 UTC m=+71.120302732" Apr 22 18:46:59.040239 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:59.040180 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lb4jp" podStartSLOduration=34.425349598 podStartE2EDuration="39.04016045s" podCreationTimestamp="2026-04-22 18:46:20 +0000 UTC" firstStartedPulling="2026-04-22 18:46:53.300010796 +0000 UTC m=+65.398360714" lastFinishedPulling="2026-04-22 18:46:57.914821647 +0000 UTC m=+70.013171566" observedRunningTime="2026-04-22 18:46:59.039261136 +0000 UTC m=+71.137611077" watchObservedRunningTime="2026-04-22 18:46:59.04016045 +0000 UTC m=+71.138510392" Apr 22 18:46:59.062991 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:59.062962 2571 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-42q2k"
Apr 22 18:46:59.408157 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:59.408138 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-42q2k"]
Apr 22 18:46:59.528056 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:46:59.528017 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7a0f4c6_c0ab_46f3_ad8c_e576f17ea7d2.slice/crio-bbf23b056e0a11c80a0d404f2db72fbf286c2601fd3ce8395e0333d5624276ed WatchSource:0}: Error finding container bbf23b056e0a11c80a0d404f2db72fbf286c2601fd3ce8395e0333d5624276ed: Status 404 returned error can't find the container with id bbf23b056e0a11c80a0d404f2db72fbf286c2601fd3ce8395e0333d5624276ed
Apr 22 18:46:59.952429 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:59.952399 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-2fc7s" event={"ID":"a06f033a-b65b-48a1-bac1-47daf3491118","Type":"ContainerStarted","Data":"af1ec06b9f87423d301d25a774cc5162f218e2e86983abcae8c453f00448d7fe"}
Apr 22 18:46:59.953809 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:59.953785 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-42q2k" event={"ID":"a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2","Type":"ContainerStarted","Data":"e03cc1f9cb2aa6c0a8b16c016bb3ae64b634cf183475daaef8e9673f1238d495"}
Apr 22 18:46:59.953915 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:59.953818 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-42q2k" event={"ID":"a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2","Type":"ContainerStarted","Data":"bbf23b056e0a11c80a0d404f2db72fbf286c2601fd3ce8395e0333d5624276ed"}
Apr 22 18:46:59.968415 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:46:59.968367 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-2fc7s" podStartSLOduration=33.681278407 podStartE2EDuration="34.968352768s" podCreationTimestamp="2026-04-22 18:46:25 +0000 UTC" firstStartedPulling="2026-04-22 18:46:58.271134084 +0000 UTC m=+70.369484016" lastFinishedPulling="2026-04-22 18:46:59.55820846 +0000 UTC m=+71.656558377" observedRunningTime="2026-04-22 18:46:59.967640582 +0000 UTC m=+72.065990535" watchObservedRunningTime="2026-04-22 18:46:59.968352768 +0000 UTC m=+72.066702708"
Apr 22 18:47:00.959465 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:00.959421 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-42q2k" event={"ID":"a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2","Type":"ContainerStarted","Data":"1b7c95672098204afa5c8dfe86e1c32e52e9a88873f33148e29cca3ed8cdbc3c"}
Apr 22 18:47:01.300111 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:01.300016 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-5h829"
Apr 22 18:47:01.300111 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:01.300067 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-5h829"
Apr 22 18:47:01.300485 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:01.300466 2571 scope.go:117] "RemoveContainer" containerID="2710cc48e4a6621bf230839f506bc1be34918986a37c9d03408cd39178e5c767"
Apr 22 18:47:01.300726 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:47:01.300706 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-5h829_openshift-console-operator(12bf8f63-f32e-40cf-b529-9b5c1f6a9053)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5h829" podUID="12bf8f63-f32e-40cf-b529-9b5c1f6a9053"
Apr 22 18:47:01.963970 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:01.963935 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-42q2k" event={"ID":"a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2","Type":"ContainerStarted","Data":"d2516bd34e54314a143755dfb420ac5ba1da332d1aa335c8fde9f901fdfb01d3"}
Apr 22 18:47:01.984084 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:01.984036 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-42q2k" podStartSLOduration=2.123743279 podStartE2EDuration="3.984020208s" podCreationTimestamp="2026-04-22 18:46:58 +0000 UTC" firstStartedPulling="2026-04-22 18:46:59.680224908 +0000 UTC m=+71.778574825" lastFinishedPulling="2026-04-22 18:47:01.540501833 +0000 UTC m=+73.638851754" observedRunningTime="2026-04-22 18:47:01.982986432 +0000 UTC m=+74.081336374" watchObservedRunningTime="2026-04-22 18:47:01.984020208 +0000 UTC m=+74.082370148"
Apr 22 18:47:06.954128 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:06.954088 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-fclvv"]
Apr 22 18:47:06.957739 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:06.957716 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-fclvv"
Apr 22 18:47:06.960453 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:06.960429 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 18:47:06.960569 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:06.960449 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 18:47:06.960569 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:06.960452 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7zc6s\""
Apr 22 18:47:06.961869 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:06.961853 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 18:47:06.961957 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:06.961902 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 18:47:07.023373 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.023338 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e623106d-3302-4ee7-b9f7-30710433e6c8-node-exporter-wtmp\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv"
Apr 22 18:47:07.023373 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.023372 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e623106d-3302-4ee7-b9f7-30710433e6c8-metrics-client-ca\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv"
Apr 22 18:47:07.023621 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.023400 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e623106d-3302-4ee7-b9f7-30710433e6c8-node-exporter-textfile\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv"
Apr 22 18:47:07.023621 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.023468 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e623106d-3302-4ee7-b9f7-30710433e6c8-node-exporter-accelerators-collector-config\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv"
Apr 22 18:47:07.023621 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.023523 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68lrb\" (UniqueName: \"kubernetes.io/projected/e623106d-3302-4ee7-b9f7-30710433e6c8-kube-api-access-68lrb\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv"
Apr 22 18:47:07.023621 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.023579 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e623106d-3302-4ee7-b9f7-30710433e6c8-node-exporter-tls\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv"
\"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e623106d-3302-4ee7-b9f7-30710433e6c8-node-exporter-tls\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.023621 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.023605 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e623106d-3302-4ee7-b9f7-30710433e6c8-sys\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.023845 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.023651 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e623106d-3302-4ee7-b9f7-30710433e6c8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.023845 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.023679 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e623106d-3302-4ee7-b9f7-30710433e6c8-root\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.124216 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.124182 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e623106d-3302-4ee7-b9f7-30710433e6c8-sys\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.124352 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.124226 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e623106d-3302-4ee7-b9f7-30710433e6c8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.124352 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.124289 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e623106d-3302-4ee7-b9f7-30710433e6c8-sys\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.124352 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.124337 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e623106d-3302-4ee7-b9f7-30710433e6c8-root\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.124463 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.124382 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e623106d-3302-4ee7-b9f7-30710433e6c8-node-exporter-wtmp\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.124463 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.124401 
2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e623106d-3302-4ee7-b9f7-30710433e6c8-root\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.124463 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.124408 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e623106d-3302-4ee7-b9f7-30710433e6c8-metrics-client-ca\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.124463 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.124430 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e623106d-3302-4ee7-b9f7-30710433e6c8-node-exporter-textfile\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.124463 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.124453 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e623106d-3302-4ee7-b9f7-30710433e6c8-node-exporter-accelerators-collector-config\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.124723 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.124484 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68lrb\" (UniqueName: \"kubernetes.io/projected/e623106d-3302-4ee7-b9f7-30710433e6c8-kube-api-access-68lrb\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.124723 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.124488 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e623106d-3302-4ee7-b9f7-30710433e6c8-node-exporter-wtmp\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.124723 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.124582 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e623106d-3302-4ee7-b9f7-30710433e6c8-node-exporter-tls\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.124723 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:47:07.124681 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 18:47:07.124911 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:47:07.124737 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e623106d-3302-4ee7-b9f7-30710433e6c8-node-exporter-tls podName:e623106d-3302-4ee7-b9f7-30710433e6c8 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:07.624720158 +0000 UTC m=+79.723070077 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/e623106d-3302-4ee7-b9f7-30710433e6c8-node-exporter-tls") pod "node-exporter-fclvv" (UID: "e623106d-3302-4ee7-b9f7-30710433e6c8") : secret "node-exporter-tls" not found Apr 22 18:47:07.124911 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.124841 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e623106d-3302-4ee7-b9f7-30710433e6c8-node-exporter-textfile\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.125071 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.125052 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e623106d-3302-4ee7-b9f7-30710433e6c8-metrics-client-ca\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.125194 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.125169 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e623106d-3302-4ee7-b9f7-30710433e6c8-node-exporter-accelerators-collector-config\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.126636 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.126621 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e623106d-3302-4ee7-b9f7-30710433e6c8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.134307 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.134286 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68lrb\" (UniqueName: \"kubernetes.io/projected/e623106d-3302-4ee7-b9f7-30710433e6c8-kube-api-access-68lrb\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.629410 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.629377 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e623106d-3302-4ee7-b9f7-30710433e6c8-node-exporter-tls\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.631691 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.631669 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e623106d-3302-4ee7-b9f7-30710433e6c8-node-exporter-tls\") pod \"node-exporter-fclvv\" (UID: \"e623106d-3302-4ee7-b9f7-30710433e6c8\") " pod="openshift-monitoring/node-exporter-fclvv" Apr 22 18:47:07.855610 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.855493 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bvpqn" Apr 22 18:47:07.867618 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.867592 2571 util.go:30] "No sandbox for pod can be found. 
Apr 22 18:47:07.876262 ip-10-0-133-42 kubenswrapper[2571]: W0422 18:47:07.876238 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode623106d_3302_4ee7_b9f7_30710433e6c8.slice/crio-a1fb51b9894116c5a8bf6b552414cd81dd6b39602849658ba5c97b7ff1df82b4 WatchSource:0}: Error finding container a1fb51b9894116c5a8bf6b552414cd81dd6b39602849658ba5c97b7ff1df82b4: Status 404 returned error can't find the container with id a1fb51b9894116c5a8bf6b552414cd81dd6b39602849658ba5c97b7ff1df82b4
Apr 22 18:47:07.982241 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:07.982203 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fclvv" event={"ID":"e623106d-3302-4ee7-b9f7-30710433e6c8","Type":"ContainerStarted","Data":"a1fb51b9894116c5a8bf6b552414cd81dd6b39602849658ba5c97b7ff1df82b4"}
Apr 22 18:47:08.956397 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:08.956370 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lb4jp"
Apr 22 18:47:08.987245 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:08.987161 2571 generic.go:358] "Generic (PLEG): container finished" podID="e623106d-3302-4ee7-b9f7-30710433e6c8" containerID="a3108f3398de1708bbdc66777d7e942f9bf2901798454b93120c77c68ecdbfe3" exitCode=0
Apr 22 18:47:08.987704 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:08.987240 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fclvv" event={"ID":"e623106d-3302-4ee7-b9f7-30710433e6c8","Type":"ContainerDied","Data":"a3108f3398de1708bbdc66777d7e942f9bf2901798454b93120c77c68ecdbfe3"}
Apr 22 18:47:09.991685 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:09.991647 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fclvv" event={"ID":"e623106d-3302-4ee7-b9f7-30710433e6c8","Type":"ContainerStarted","Data":"6f28d7912405f2184b09dab67ccb45bce91d09506bb4a84aba12ae42019843cf"}
Apr 22 18:47:09.991685 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:09.991684 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fclvv" event={"ID":"e623106d-3302-4ee7-b9f7-30710433e6c8","Type":"ContainerStarted","Data":"dcc97a038bbc256936512765eb75af3fa40d8d9a8d629db4010ac646e7b5ba91"}
Apr 22 18:47:10.011795 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:10.011744 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-fclvv" podStartSLOduration=3.162440442 podStartE2EDuration="4.011729222s" podCreationTimestamp="2026-04-22 18:47:06 +0000 UTC" firstStartedPulling="2026-04-22 18:47:07.877941268 +0000 UTC m=+79.976291198" lastFinishedPulling="2026-04-22 18:47:08.727230046 +0000 UTC m=+80.825579978" observedRunningTime="2026-04-22 18:47:10.010121305 +0000 UTC m=+82.108471256" watchObservedRunningTime="2026-04-22 18:47:10.011729222 +0000 UTC m=+82.110079161"
Apr 22 18:47:12.969337 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:12.969303 2571 patch_prober.go:28] interesting pod/image-registry-7d6f96f686-fwq9s container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 18:47:12.969702 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:12.969360 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" podUID="a07f3c68-23f1-4479-83d7-6e71fb29694e" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:47:13.423004 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:13.422928 2571 scope.go:117] "RemoveContainer" containerID="2710cc48e4a6621bf230839f506bc1be34918986a37c9d03408cd39178e5c767"
Apr 22 18:47:14.004761 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:14.004733 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5h829_12bf8f63-f32e-40cf-b529-9b5c1f6a9053/console-operator/2.log"
Apr 22 18:47:14.005148 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:14.004811 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5h829" event={"ID":"12bf8f63-f32e-40cf-b529-9b5c1f6a9053","Type":"ContainerStarted","Data":"d6af0c9b7e47ba216abad43e78f6a84aa467aa0cd1fc0ee18a02d20ade104443"}
Apr 22 18:47:14.005148 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:14.005107 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-5h829"
Apr 22 18:47:14.016077 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:14.016055 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-5h829"
Apr 22 18:47:14.027176 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:14.027116 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-5h829" podStartSLOduration=54.388830316 podStartE2EDuration="1m7.027100991s" podCreationTimestamp="2026-04-22 18:46:07 +0000 UTC" firstStartedPulling="2026-04-22 18:46:23.567364721 +0000 UTC m=+35.665714646" lastFinishedPulling="2026-04-22 18:46:36.205635397 +0000 UTC m=+48.303985321" observedRunningTime="2026-04-22 18:47:14.025776423 +0000 UTC m=+86.124126364" watchObservedRunningTime="2026-04-22 18:47:14.027100991 +0000 UTC m=+86.125450933"
Apr 22 18:47:14.932328 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:14.932302 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s"
Apr 22 18:47:20.531440 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:20.531406 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7d6f96f686-fwq9s"]
Apr 22 18:47:45.550855 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:45.550810 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" podUID="a07f3c68-23f1-4479-83d7-6e71fb29694e" containerName="registry" containerID="cri-o://7ae1ebc56c5414454b678784001641a06d04ef50a7657dcd6b128c91dc94a54e" gracePeriod=30
Apr 22 18:47:45.789233 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:45.789204 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s"
Apr 22 18:47:45.956874 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:45.956847 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls\") pod \"a07f3c68-23f1-4479-83d7-6e71fb29694e\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") "
Apr 22 18:47:45.957036 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:45.956897 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a07f3c68-23f1-4479-83d7-6e71fb29694e-ca-trust-extracted\") pod \"a07f3c68-23f1-4479-83d7-6e71fb29694e\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") "
Apr 22 18:47:45.957036 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:45.956917 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a07f3c68-23f1-4479-83d7-6e71fb29694e-installation-pull-secrets\") pod \"a07f3c68-23f1-4479-83d7-6e71fb29694e\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") "
Apr 22 18:47:45.957036 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:45.956949 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp4sz\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-kube-api-access-bp4sz\") pod \"a07f3c68-23f1-4479-83d7-6e71fb29694e\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") "
Apr 22 18:47:45.957036 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:45.956968 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a07f3c68-23f1-4479-83d7-6e71fb29694e-trusted-ca\") pod \"a07f3c68-23f1-4479-83d7-6e71fb29694e\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") "
Apr 22 18:47:45.957036 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:45.956988 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a07f3c68-23f1-4479-83d7-6e71fb29694e-image-registry-private-configuration\") pod \"a07f3c68-23f1-4479-83d7-6e71fb29694e\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") "
Apr 22 18:47:45.957275 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:45.957056 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-certificates\") pod \"a07f3c68-23f1-4479-83d7-6e71fb29694e\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") "
Apr 22 18:47:45.957275 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:45.957079 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-bound-sa-token\") pod \"a07f3c68-23f1-4479-83d7-6e71fb29694e\" (UID: \"a07f3c68-23f1-4479-83d7-6e71fb29694e\") "
Apr 22 18:47:45.957710 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:45.957675 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a07f3c68-23f1-4479-83d7-6e71fb29694e" (UID: "a07f3c68-23f1-4479-83d7-6e71fb29694e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:47:45.957949 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:45.957924 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a07f3c68-23f1-4479-83d7-6e71fb29694e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a07f3c68-23f1-4479-83d7-6e71fb29694e" (UID: "a07f3c68-23f1-4479-83d7-6e71fb29694e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:47:45.959414 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:45.959385 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a07f3c68-23f1-4479-83d7-6e71fb29694e" (UID: "a07f3c68-23f1-4479-83d7-6e71fb29694e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:47:45.959615 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:45.959578 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a07f3c68-23f1-4479-83d7-6e71fb29694e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a07f3c68-23f1-4479-83d7-6e71fb29694e" (UID: "a07f3c68-23f1-4479-83d7-6e71fb29694e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:45.959721 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:45.959686 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-kube-api-access-bp4sz" (OuterVolumeSpecName: "kube-api-access-bp4sz") pod "a07f3c68-23f1-4479-83d7-6e71fb29694e" (UID: "a07f3c68-23f1-4479-83d7-6e71fb29694e"). InnerVolumeSpecName "kube-api-access-bp4sz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:47:45.959780 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:45.959757 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a07f3c68-23f1-4479-83d7-6e71fb29694e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "a07f3c68-23f1-4479-83d7-6e71fb29694e" (UID: "a07f3c68-23f1-4479-83d7-6e71fb29694e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:45.959834 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:45.959795 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a07f3c68-23f1-4479-83d7-6e71fb29694e" (UID: "a07f3c68-23f1-4479-83d7-6e71fb29694e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:47:45.965931 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:45.965906 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a07f3c68-23f1-4479-83d7-6e71fb29694e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a07f3c68-23f1-4479-83d7-6e71fb29694e" (UID: "a07f3c68-23f1-4479-83d7-6e71fb29694e"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:47:46.058445 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:46.058414 2571 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-tls\") on node \"ip-10-0-133-42.ec2.internal\" DevicePath \"\"" Apr 22 18:47:46.058445 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:46.058441 2571 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a07f3c68-23f1-4479-83d7-6e71fb29694e-ca-trust-extracted\") on node \"ip-10-0-133-42.ec2.internal\" DevicePath \"\"" Apr 22 18:47:46.058645 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:46.058454 2571 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a07f3c68-23f1-4479-83d7-6e71fb29694e-installation-pull-secrets\") on node \"ip-10-0-133-42.ec2.internal\" DevicePath \"\"" Apr 22 18:47:46.058645 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:46.058463 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bp4sz\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-kube-api-access-bp4sz\") on node \"ip-10-0-133-42.ec2.internal\" DevicePath \"\"" Apr 22 18:47:46.058645 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:46.058473 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a07f3c68-23f1-4479-83d7-6e71fb29694e-trusted-ca\") on node \"ip-10-0-133-42.ec2.internal\" DevicePath \"\"" Apr 22 18:47:46.058645 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:46.058482 2571 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a07f3c68-23f1-4479-83d7-6e71fb29694e-image-registry-private-configuration\") on node \"ip-10-0-133-42.ec2.internal\" DevicePath \"\"" Apr 22 18:47:46.058645 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:46.058498 2571 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a07f3c68-23f1-4479-83d7-6e71fb29694e-registry-certificates\") on node \"ip-10-0-133-42.ec2.internal\" DevicePath \"\"" Apr 22 18:47:46.058645 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:46.058508 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a07f3c68-23f1-4479-83d7-6e71fb29694e-bound-sa-token\") on node \"ip-10-0-133-42.ec2.internal\" DevicePath \"\"" Apr 22 18:47:46.104955 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:46.104924 2571 generic.go:358] "Generic (PLEG): container finished" podID="a07f3c68-23f1-4479-83d7-6e71fb29694e" containerID="7ae1ebc56c5414454b678784001641a06d04ef50a7657dcd6b128c91dc94a54e" exitCode=0 Apr 22 18:47:46.105099 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:46.104987 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" Apr 22 18:47:46.105099 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:46.105012 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" event={"ID":"a07f3c68-23f1-4479-83d7-6e71fb29694e","Type":"ContainerDied","Data":"7ae1ebc56c5414454b678784001641a06d04ef50a7657dcd6b128c91dc94a54e"} Apr 22 18:47:46.105099 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:46.105060 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7d6f96f686-fwq9s" event={"ID":"a07f3c68-23f1-4479-83d7-6e71fb29694e","Type":"ContainerDied","Data":"da878ea5d4c3c06f81b677c17e431b8556773d88a9b8b173475b5cb00470d6ff"} Apr 22 18:47:46.105099 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:46.105086 2571 scope.go:117] "RemoveContainer" containerID="7ae1ebc56c5414454b678784001641a06d04ef50a7657dcd6b128c91dc94a54e" Apr 22 18:47:46.113456 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:46.113427 2571 scope.go:117] "RemoveContainer" containerID="7ae1ebc56c5414454b678784001641a06d04ef50a7657dcd6b128c91dc94a54e" Apr 22 18:47:46.113749 ip-10-0-133-42 kubenswrapper[2571]: E0422 18:47:46.113722 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ae1ebc56c5414454b678784001641a06d04ef50a7657dcd6b128c91dc94a54e\": container with ID starting with 7ae1ebc56c5414454b678784001641a06d04ef50a7657dcd6b128c91dc94a54e not found: ID does not exist" containerID="7ae1ebc56c5414454b678784001641a06d04ef50a7657dcd6b128c91dc94a54e" Apr 22 18:47:46.113853 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:46.113757 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae1ebc56c5414454b678784001641a06d04ef50a7657dcd6b128c91dc94a54e"} err="failed to get container status \"7ae1ebc56c5414454b678784001641a06d04ef50a7657dcd6b128c91dc94a54e\": rpc error: code = NotFound desc = could not find container \"7ae1ebc56c5414454b678784001641a06d04ef50a7657dcd6b128c91dc94a54e\": container with ID starting with 7ae1ebc56c5414454b678784001641a06d04ef50a7657dcd6b128c91dc94a54e not found: ID does not exist" Apr 22 18:47:46.126561 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:46.126483 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7d6f96f686-fwq9s"] Apr 22 18:47:46.131696 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:46.131671 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7d6f96f686-fwq9s"] Apr 22 18:47:46.427049 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:46.426968 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a07f3c68-23f1-4479-83d7-6e71fb29694e" path="/var/lib/kubelet/pods/a07f3c68-23f1-4479-83d7-6e71fb29694e/volumes" Apr 22 18:47:48.112511 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:48.112474 2571 generic.go:358] "Generic (PLEG): container finished" podID="166524a6-4f60-49b8-9020-7bae0d51168c" containerID="0ce205a5b238b83db67e2bfbc7b90736eafda039816dd499cd3e75dc35be6274" exitCode=0 Apr 22 18:47:48.112896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:48.112561 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwsv8" 
event={"ID":"166524a6-4f60-49b8-9020-7bae0d51168c","Type":"ContainerDied","Data":"0ce205a5b238b83db67e2bfbc7b90736eafda039816dd499cd3e75dc35be6274"} Apr 22 18:47:48.112896 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:48.112891 2571 scope.go:117] "RemoveContainer" containerID="0ce205a5b238b83db67e2bfbc7b90736eafda039816dd499cd3e75dc35be6274" Apr 22 18:47:49.116799 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:49.116761 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwsv8" event={"ID":"166524a6-4f60-49b8-9020-7bae0d51168c","Type":"ContainerStarted","Data":"003ee51d93053c10981da6d12a759cf155c706b9995a73d8046e37e263282481"} Apr 22 18:47:58.144429 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:58.144345 2571 generic.go:358] "Generic (PLEG): container finished" podID="80e69f6d-8332-4923-a03d-15387087dc5a" containerID="aededa3255565b070f1befd7a503d0a7a5b427cb740d6bcc54d857cd1c88ce06" exitCode=0 Apr 22 18:47:58.144870 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:58.144421 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-tnfzj" event={"ID":"80e69f6d-8332-4923-a03d-15387087dc5a","Type":"ContainerDied","Data":"aededa3255565b070f1befd7a503d0a7a5b427cb740d6bcc54d857cd1c88ce06"} Apr 22 18:47:58.144870 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:58.144767 2571 scope.go:117] "RemoveContainer" containerID="aededa3255565b070f1befd7a503d0a7a5b427cb740d6bcc54d857cd1c88ce06" Apr 22 18:47:59.148924 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:47:59.148878 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-tnfzj" event={"ID":"80e69f6d-8332-4923-a03d-15387087dc5a","Type":"ContainerStarted","Data":"2728d1a1299de49b18016dffaa6092a1fce56425ed0362fa696b5f0ef181dfd9"} Apr 22 18:48:03.895614 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:48:03.895575 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" podUID="714499d9-f67c-41fd-bb2a-dcdf1721b2b9" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 18:48:07.173594 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:48:07.173531 2571 generic.go:358] "Generic (PLEG): container finished" podID="38e2749f-84dd-40f0-b864-c5aaddc913a8" containerID="880f0e3cb7e6e882ebca9dbe427ae265e252894fa3f91d33a61dd2da4c2f8430" exitCode=0 Apr 22 18:48:07.173962 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:48:07.173613 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vqvdc" event={"ID":"38e2749f-84dd-40f0-b864-c5aaddc913a8","Type":"ContainerDied","Data":"880f0e3cb7e6e882ebca9dbe427ae265e252894fa3f91d33a61dd2da4c2f8430"} Apr 22 18:48:07.174023 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:48:07.174003 2571 scope.go:117] "RemoveContainer" containerID="880f0e3cb7e6e882ebca9dbe427ae265e252894fa3f91d33a61dd2da4c2f8430" Apr 22 18:48:08.177762 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:48:08.177730 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vqvdc" event={"ID":"38e2749f-84dd-40f0-b864-c5aaddc913a8","Type":"ContainerStarted","Data":"1399c8243042c23d919cecfba602c5a96468c0209d43701378702aaf1b48055e"} Apr 22 18:48:13.896262 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:48:13.896221 2571 
prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" podUID="714499d9-f67c-41fd-bb2a-dcdf1721b2b9" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 18:48:23.895842 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:48:23.895802 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" podUID="714499d9-f67c-41fd-bb2a-dcdf1721b2b9" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 18:48:23.896216 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:48:23.895882 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" Apr 22 18:48:23.896521 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:48:23.896503 2571 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"7eba6fd42b41030f8932bf229a4c3f43313db3c5d21db1a46e747965f0341915"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 18:48:23.896587 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:48:23.896570 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" podUID="714499d9-f67c-41fd-bb2a-dcdf1721b2b9" containerName="service-proxy" containerID="cri-o://7eba6fd42b41030f8932bf229a4c3f43313db3c5d21db1a46e747965f0341915" gracePeriod=30 Apr 22 18:48:24.230250 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:48:24.230214 2571 generic.go:358] "Generic (PLEG): container finished" podID="714499d9-f67c-41fd-bb2a-dcdf1721b2b9" containerID="7eba6fd42b41030f8932bf229a4c3f43313db3c5d21db1a46e747965f0341915" exitCode=2 Apr 22 18:48:24.230408 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:48:24.230281 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" event={"ID":"714499d9-f67c-41fd-bb2a-dcdf1721b2b9","Type":"ContainerDied","Data":"7eba6fd42b41030f8932bf229a4c3f43313db3c5d21db1a46e747965f0341915"} Apr 22 18:48:24.230408 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:48:24.230318 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74878b968b-8gv5r" event={"ID":"714499d9-f67c-41fd-bb2a-dcdf1721b2b9","Type":"ContainerStarted","Data":"c7c5bfa4a09ab3f2a4d27711b4eb43911f32836fc27716270a58179a333dbe28"} Apr 22 18:50:48.328265 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:50:48.328237 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5h829_12bf8f63-f32e-40cf-b529-9b5c1f6a9053/console-operator/2.log" Apr 22 18:50:48.329209 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:50:48.329184 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5h829_12bf8f63-f32e-40cf-b529-9b5c1f6a9053/console-operator/2.log" Apr 22 18:50:48.333430 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:50:48.333410 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/ovn-acl-logging/0.log" Apr 22 18:50:48.334176 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:50:48.334156 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/ovn-acl-logging/0.log" Apr 22 18:55:48.350891 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:55:48.350863 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5h829_12bf8f63-f32e-40cf-b529-9b5c1f6a9053/console-operator/2.log" Apr 22 18:55:48.352784 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:55:48.352761 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5h829_12bf8f63-f32e-40cf-b529-9b5c1f6a9053/console-operator/2.log" Apr 22 18:55:48.355508 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:55:48.355488 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/ovn-acl-logging/0.log" Apr 22 18:55:48.357217 ip-10-0-133-42 kubenswrapper[2571]: I0422 18:55:48.357199 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/ovn-acl-logging/0.log" Apr 22 19:00:48.370642 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:00:48.370608 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5h829_12bf8f63-f32e-40cf-b529-9b5c1f6a9053/console-operator/2.log" Apr 22 19:00:48.373838 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:00:48.373816 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5h829_12bf8f63-f32e-40cf-b529-9b5c1f6a9053/console-operator/2.log" Apr 22 19:00:48.375158 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:00:48.375137 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/ovn-acl-logging/0.log" Apr 22 19:00:48.378796 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:00:48.378774 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/ovn-acl-logging/0.log" Apr 22 19:05:48.391340 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:05:48.391289 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5h829_12bf8f63-f32e-40cf-b529-9b5c1f6a9053/console-operator/2.log" Apr 22 19:05:48.395608 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:05:48.395587 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/ovn-acl-logging/0.log" Apr 22 19:05:48.397294 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:05:48.397275 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5h829_12bf8f63-f32e-40cf-b529-9b5c1f6a9053/console-operator/2.log" Apr 22 19:05:48.401220 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:05:48.401201 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/ovn-acl-logging/0.log" Apr 22 19:08:18.190349 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:18.190258 2571 
log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-fk9jc_f793756b-ba29-4f1d-878b-9d4abe4d5ad3/global-pull-secret-syncer/0.log" Apr 22 19:08:18.275017 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:18.274989 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-2s28b_91da0166-33f7-46f2-9824-bf2339c00a28/konnectivity-agent/0.log" Apr 22 19:08:18.371621 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:18.371592 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-42.ec2.internal_380fabb8e9eee8f3530eb1504d622a92/haproxy/0.log" Apr 22 19:08:22.131239 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:22.131203 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-9jp77_d5b60b0a-0c13-4d95-83e6-b6d336565a6b/cluster-monitoring-operator/0.log" Apr 22 19:08:22.282636 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:22.282605 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fclvv_e623106d-3302-4ee7-b9f7-30710433e6c8/node-exporter/0.log" Apr 22 19:08:22.300552 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:22.300505 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fclvv_e623106d-3302-4ee7-b9f7-30710433e6c8/kube-rbac-proxy/0.log" Apr 22 19:08:22.320420 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:22.320401 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fclvv_e623106d-3302-4ee7-b9f7-30710433e6c8/init-textfile/0.log" Apr 22 19:08:24.157785 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:24.157756 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-2fc7s_a06f033a-b65b-48a1-bac1-47daf3491118/networking-console-plugin/0.log" Apr 22 19:08:24.525635 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:24.525606 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5h829_12bf8f63-f32e-40cf-b529-9b5c1f6a9053/console-operator/2.log" Apr 22 19:08:24.533290 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:24.533259 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5h829_12bf8f63-f32e-40cf-b529-9b5c1f6a9053/console-operator/3.log" Apr 22 19:08:25.128904 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.128187 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x"] Apr 22 19:08:25.130337 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.128853 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a07f3c68-23f1-4479-83d7-6e71fb29694e" containerName="registry" Apr 22 19:08:25.130337 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.129152 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a07f3c68-23f1-4479-83d7-6e71fb29694e" containerName="registry" Apr 22 19:08:25.130337 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.129336 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a07f3c68-23f1-4479-83d7-6e71fb29694e" containerName="registry" Apr 22 19:08:25.132985 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.132965 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" Apr 22 19:08:25.136860 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.136835 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-k99vz\"/\"openshift-service-ca.crt\"" Apr 22 19:08:25.138014 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.137989 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-k99vz\"/\"kube-root-ca.crt\"" Apr 22 19:08:25.138133 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.138043 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-k99vz\"/\"default-dockercfg-q7x9l\"" Apr 22 19:08:25.138388 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.138370 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x"] Apr 22 19:08:25.195781 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.195751 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm86p\" (UniqueName: \"kubernetes.io/projected/34710fac-a607-48e9-96c5-70985cb5ff40-kube-api-access-rm86p\") pod \"perf-node-gather-daemonset-6p68x\" (UID: \"34710fac-a607-48e9-96c5-70985cb5ff40\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" Apr 22 19:08:25.196138 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.195801 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/34710fac-a607-48e9-96c5-70985cb5ff40-proc\") pod \"perf-node-gather-daemonset-6p68x\" (UID: \"34710fac-a607-48e9-96c5-70985cb5ff40\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" Apr 22 19:08:25.196138 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.195846 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34710fac-a607-48e9-96c5-70985cb5ff40-sys\") pod \"perf-node-gather-daemonset-6p68x\" (UID: \"34710fac-a607-48e9-96c5-70985cb5ff40\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" Apr 22 19:08:25.196138 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.195878 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34710fac-a607-48e9-96c5-70985cb5ff40-lib-modules\") pod \"perf-node-gather-daemonset-6p68x\" (UID: \"34710fac-a607-48e9-96c5-70985cb5ff40\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" Apr 22 19:08:25.196138 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.195894 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/34710fac-a607-48e9-96c5-70985cb5ff40-podres\") pod \"perf-node-gather-daemonset-6p68x\" (UID: \"34710fac-a607-48e9-96c5-70985cb5ff40\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" Apr 22 19:08:25.283013 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.282984 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-fdhpc_a1739ddc-1770-4db7-a55f-9bc8b4cf2c65/volume-data-source-validator/0.log" Apr 22 19:08:25.296437 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.296412 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rm86p\" (UniqueName: \"kubernetes.io/projected/34710fac-a607-48e9-96c5-70985cb5ff40-kube-api-access-rm86p\") pod \"perf-node-gather-daemonset-6p68x\" (UID: \"34710fac-a607-48e9-96c5-70985cb5ff40\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" Apr 22 19:08:25.296607 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.296455 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/34710fac-a607-48e9-96c5-70985cb5ff40-proc\") pod \"perf-node-gather-daemonset-6p68x\" (UID: \"34710fac-a607-48e9-96c5-70985cb5ff40\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" Apr 22 19:08:25.296607 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.296475 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34710fac-a607-48e9-96c5-70985cb5ff40-sys\") pod \"perf-node-gather-daemonset-6p68x\" (UID: \"34710fac-a607-48e9-96c5-70985cb5ff40\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" Apr 22 19:08:25.296607 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.296492 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34710fac-a607-48e9-96c5-70985cb5ff40-lib-modules\") pod \"perf-node-gather-daemonset-6p68x\" (UID: \"34710fac-a607-48e9-96c5-70985cb5ff40\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" Apr 22 19:08:25.296607 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.296506 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/34710fac-a607-48e9-96c5-70985cb5ff40-podres\") pod \"perf-node-gather-daemonset-6p68x\" (UID: \"34710fac-a607-48e9-96c5-70985cb5ff40\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" Apr 22 19:08:25.296607 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.296545 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/34710fac-a607-48e9-96c5-70985cb5ff40-proc\") pod \"perf-node-gather-daemonset-6p68x\" (UID: \"34710fac-a607-48e9-96c5-70985cb5ff40\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" Apr 22 19:08:25.296607 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.296574 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34710fac-a607-48e9-96c5-70985cb5ff40-sys\") pod \"perf-node-gather-daemonset-6p68x\" (UID: \"34710fac-a607-48e9-96c5-70985cb5ff40\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" Apr 22 19:08:25.296859 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.296636 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/34710fac-a607-48e9-96c5-70985cb5ff40-podres\") pod \"perf-node-gather-daemonset-6p68x\" (UID: \"34710fac-a607-48e9-96c5-70985cb5ff40\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" Apr 22 19:08:25.296859 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.296637 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34710fac-a607-48e9-96c5-70985cb5ff40-lib-modules\") pod \"perf-node-gather-daemonset-6p68x\" (UID: 
\"34710fac-a607-48e9-96c5-70985cb5ff40\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" Apr 22 19:08:25.307750 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.307726 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm86p\" (UniqueName: \"kubernetes.io/projected/34710fac-a607-48e9-96c5-70985cb5ff40-kube-api-access-rm86p\") pod \"perf-node-gather-daemonset-6p68x\" (UID: \"34710fac-a607-48e9-96c5-70985cb5ff40\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" Apr 22 19:08:25.444675 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.444647 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" Apr 22 19:08:25.563188 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.563150 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x"] Apr 22 19:08:25.566158 ip-10-0-133-42 kubenswrapper[2571]: W0422 19:08:25.566123 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod34710fac_a607_48e9_96c5_70985cb5ff40.slice/crio-e8fe2032d621a85d6c379062d50a3cbda5a13beeee4a3bf5d086d478afcadf05 WatchSource:0}: Error finding container e8fe2032d621a85d6c379062d50a3cbda5a13beeee4a3bf5d086d478afcadf05: Status 404 returned error can't find the container with id e8fe2032d621a85d6c379062d50a3cbda5a13beeee4a3bf5d086d478afcadf05 Apr 22 19:08:25.567778 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.567762 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:08:25.605892 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:25.605866 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" event={"ID":"34710fac-a607-48e9-96c5-70985cb5ff40","Type":"ContainerStarted","Data":"e8fe2032d621a85d6c379062d50a3cbda5a13beeee4a3bf5d086d478afcadf05"} Apr 22 19:08:26.018974 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:26.018946 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lb4jp_f9d429a1-a93e-47db-bed8-196a5bb0f748/dns/0.log" Apr 22 19:08:26.038079 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:26.038052 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lb4jp_f9d429a1-a93e-47db-bed8-196a5bb0f748/kube-rbac-proxy/0.log" Apr 22 19:08:26.080270 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:26.080248 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qzzp6_0c75569b-ad2e-4296-8dad-807a8c913df1/dns-node-resolver/0.log" Apr 22 19:08:26.543609 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:26.543579 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rvx2v_e63b9430-ddb4-4626-8019-7bfa90ffac77/node-ca/0.log" Apr 22 19:08:26.610073 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:26.610036 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" event={"ID":"34710fac-a607-48e9-96c5-70985cb5ff40","Type":"ContainerStarted","Data":"b2493f85fc97fe4de86ff8016d1584f5ef93cbc876aa939c167089455ed33961"} Apr 22 19:08:26.610238 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:26.610158 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" Apr 22 19:08:26.625401 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:26.625355 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" podStartSLOduration=1.625315682 podStartE2EDuration="1.625315682s" podCreationTimestamp="2026-04-22 19:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:08:26.624710775 +0000 UTC m=+1358.723060715" watchObservedRunningTime="2026-04-22 19:08:26.625315682 +0000 UTC m=+1358.723665671" Apr 22 19:08:27.220163 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:27.220132 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-56d9858896-wq4xk_0e27f58c-14be-45e2-8336-0577f514ae76/router/0.log" Apr 22 19:08:27.545378 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:27.545287 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4njjm_e23ae0a9-f762-4687-bfe0-c02d473142be/serve-healthcheck-canary/0.log" Apr 22 19:08:27.912840 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:27.912743 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-tnfzj_80e69f6d-8332-4923-a03d-15387087dc5a/insights-operator/1.log" Apr 22 19:08:27.912997 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:27.912910 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-tnfzj_80e69f6d-8332-4923-a03d-15387087dc5a/insights-operator/0.log" Apr 22 19:08:27.932072 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:27.932052 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-42q2k_a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2/kube-rbac-proxy/0.log" Apr 22 19:08:27.951059 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:27.951029 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-42q2k_a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2/exporter/0.log" Apr 22 19:08:27.970678 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:27.970655 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-42q2k_a7a0f4c6-c0ab-46f3-ad8c-e576f17ea7d2/extractor/0.log" Apr 22 19:08:32.621982 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:32.621950 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-6p68x" Apr 22 19:08:33.639774 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:33.639739 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-txhb9_2a94e10b-e40c-4bfa-99b0-a02b6e36263e/migrator/0.log" Apr 22 19:08:33.681854 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:33.681822 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-txhb9_2a94e10b-e40c-4bfa-99b0-a02b6e36263e/graceful-termination/0.log" Apr 22 19:08:34.075903 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:34.075872 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qwsv8_166524a6-4f60-49b8-9020-7bae0d51168c/kube-storage-version-migrator-operator/1.log" Apr 22 19:08:34.077496 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:34.077469 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qwsv8_166524a6-4f60-49b8-9020-7bae0d51168c/kube-storage-version-migrator-operator/0.log" Apr 22 19:08:35.210501 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:35.210474 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brbjp_22922279-9d57-4b39-9e9b-25a133f37c1b/kube-multus-additional-cni-plugins/0.log" Apr 22 19:08:35.230003 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:35.229973 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brbjp_22922279-9d57-4b39-9e9b-25a133f37c1b/egress-router-binary-copy/0.log" Apr 22 19:08:35.248988 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:35.248961 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brbjp_22922279-9d57-4b39-9e9b-25a133f37c1b/cni-plugins/0.log" Apr 22 19:08:35.267548 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:35.267508 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brbjp_22922279-9d57-4b39-9e9b-25a133f37c1b/bond-cni-plugin/0.log" Apr 22 19:08:35.286475 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:35.286448 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brbjp_22922279-9d57-4b39-9e9b-25a133f37c1b/routeoverride-cni/0.log" Apr 22 19:08:35.305897 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:35.305874 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brbjp_22922279-9d57-4b39-9e9b-25a133f37c1b/whereabouts-cni-bincopy/0.log" Apr 22 19:08:35.325647 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:35.325625 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brbjp_22922279-9d57-4b39-9e9b-25a133f37c1b/whereabouts-cni/0.log" Apr 22 19:08:35.487619 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:35.487488 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hsb2f_4867ac2c-3d1c-44a3-b5d4-495f207482ed/kube-multus/0.log" Apr 22 19:08:35.659753 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:35.659723 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-v6b2n_e0d458b0-40cd-4eaf-8dbf-220566ae55ef/network-metrics-daemon/0.log" Apr 22 19:08:35.677753 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:35.677726 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-v6b2n_e0d458b0-40cd-4eaf-8dbf-220566ae55ef/kube-rbac-proxy/0.log" Apr 22 19:08:36.680710 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:36.680619 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/ovn-controller/0.log" Apr 22 19:08:36.695892 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:36.695852 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/ovn-acl-logging/0.log" Apr 22 19:08:36.708128 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:36.708100 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/ovn-acl-logging/1.log" Apr 22 19:08:36.727997 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:36.727967 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/kube-rbac-proxy-node/0.log" Apr 22 19:08:36.749275 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:36.749249 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 19:08:36.765496 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:36.765467 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/northd/0.log" Apr 22 19:08:36.784770 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:36.784748 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/nbdb/0.log" Apr 22 19:08:36.805710 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:36.805683 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/sbdb/0.log" Apr 22 19:08:36.948718 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:36.948677 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vmbf4_97c5443c-b607-45ba-8245-88b3b1af7d19/ovnkube-controller/0.log" Apr 22 19:08:38.221664 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:38.221636 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-kxklx_a2f5711c-2c58-47dd-a89b-ab792485adbf/check-endpoints/0.log" Apr 22 19:08:38.284511 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:38.284478 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-bvpqn_fb5bea5b-4447-44e8-8573-662eda69835e/network-check-target-container/0.log" Apr 22 19:08:39.121553 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:39.121505 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-gvq84_64e165b4-deda-4d6f-8f70-c28ac7cebec4/iptables-alerter/0.log" Apr 22 19:08:39.673964 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:39.673926 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-4wgsq_72131875-3a6a-454e-a845-bdca533f20de/tuned/0.log" Apr 22 19:08:41.385790 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:41.385761 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-fnw4t_e8cf109c-0a09-449e-87f3-54ad5a412455/cluster-samples-operator/0.log" Apr 22 19:08:41.408627 ip-10-0-133-42 kubenswrapper[2571]: I0422 19:08:41.408598 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-fnw4t_e8cf109c-0a09-449e-87f3-54ad5a412455/cluster-samples-operator-watch/0.log"