Apr 22 18:33:30.051979 ip-10-0-131-5 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 18:33:30.051991 ip-10-0-131-5 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 18:33:30.051998 ip-10-0-131-5 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 18:33:30.052209 ip-10-0-131-5 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 18:33:40.219638 ip-10-0-131-5 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 18:33:40.219656 ip-10-0-131-5 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot f82805f37cbf4277bb5b084a896ab38a --
Apr 22 18:36:05.021630 ip-10-0-131-5 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:36:05.460645 ip-10-0-131-5 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:36:05.460645 ip-10-0-131-5 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:36:05.460645 ip-10-0-131-5 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:36:05.460645 ip-10-0-131-5 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:36:05.460645 ip-10-0-131-5 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:36:05.461778 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.461346    2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:36:05.464379 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464363    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:36:05.464379 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464377    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:36:05.464379 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464381    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:36:05.464467 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464385    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:36:05.464467 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464389    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:36:05.464467 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464392    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:36:05.464467 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464395    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:36:05.464467 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464398    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:36:05.464467 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464402    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:36:05.464467 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464414    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:36:05.464467 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464418    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:36:05.464467 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464421    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:36:05.464467 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464424    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:36:05.464467 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464427    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:36:05.464467 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464429    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:36:05.464467 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464432    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:36:05.464467 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464434    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:36:05.464467 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464437    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:36:05.464467 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464440    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:36:05.464467 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464442    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:36:05.464467 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464445    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:36:05.464467 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464447    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:36:05.464467 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464452    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:36:05.464947 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464455    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:36:05.464947 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464458    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:36:05.464947 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464461    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:36:05.464947 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464464    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:36:05.464947 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464467    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:36:05.464947 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464470    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:36:05.464947 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464473    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:36:05.464947 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464476    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:36:05.464947 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464479    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:36:05.464947 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464482    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:36:05.464947 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464485    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:36:05.464947 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464487    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:36:05.464947 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464490    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:36:05.464947 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464492    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:36:05.464947 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464495    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:36:05.464947 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464497    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:36:05.464947 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464500    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:36:05.464947 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464502    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:36:05.464947 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464505    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:36:05.464947 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464508    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:36:05.465460 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464510    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:36:05.465460 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464513    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:36:05.465460 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464515    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:36:05.465460 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464519    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:36:05.465460 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464522    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:36:05.465460 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464525    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:36:05.465460 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464527    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:36:05.465460 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464530    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:36:05.465460 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464532    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:36:05.465460 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464535    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:36:05.465460 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464537    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:36:05.465460 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464540    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:36:05.465460 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464542    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:36:05.465460 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464546    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:36:05.465460 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464550    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:36:05.465460 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464553    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:36:05.465460 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464555    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:36:05.465460 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464558    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:36:05.465460 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464560    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:36:05.465920 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464563    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:36:05.465920 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464566    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:36:05.465920 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464569    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:36:05.465920 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464572    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:36:05.465920 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464574    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:36:05.465920 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464577    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:36:05.465920 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464579    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:36:05.465920 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464584    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:36:05.465920 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464588    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:36:05.465920 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464591    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:36:05.465920 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464593    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:36:05.465920 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464596    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:36:05.465920 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464599    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:36:05.465920 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464601    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:36:05.465920 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464604    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:36:05.465920 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464606    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:36:05.465920 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464609    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:36:05.465920 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464612    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:36:05.465920 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464614    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:36:05.466435 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464617    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:36:05.466435 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464619    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:36:05.466435 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464622    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:36:05.466435 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464624    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:36:05.466435 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.464627    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:36:05.466435 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465003    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:36:05.466435 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465008    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:36:05.466435 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465011    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:36:05.466435 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465015    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:36:05.466435 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465017    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:36:05.466435 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465020    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:36:05.466435 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465023    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:36:05.466435 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465025    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:36:05.466435 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465028    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:36:05.466435 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465031    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:36:05.466435 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465034    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:36:05.466435 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465036    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:36:05.466435 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465039    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:36:05.466435 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465042    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:36:05.466435 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465044    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:36:05.466930 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465047    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:36:05.466930 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465049    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:36:05.466930 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465052    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:36:05.466930 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465054    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:36:05.466930 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465057    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:36:05.466930 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465059    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:36:05.466930 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465062    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:36:05.466930 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465064    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:36:05.466930 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465067    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:36:05.466930 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465070    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:36:05.466930 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465072    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:36:05.466930 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465075    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:36:05.466930 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465078    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:36:05.466930 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465080    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:36:05.466930 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465083    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:36:05.466930 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465086    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:36:05.466930 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465088    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:36:05.466930 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465091    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:36:05.466930 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465094    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:36:05.466930 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465097    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:36:05.467445 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465099    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:36:05.467445 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465102    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:36:05.467445 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465105    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:36:05.467445 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465107    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:36:05.467445 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465110    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:36:05.467445 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465113    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:36:05.467445 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465116    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:36:05.467445 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465118    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:36:05.467445 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465121    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:36:05.467445 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465124    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:36:05.467445 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465128    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:36:05.467445 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465130    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:36:05.467445 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465133    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:36:05.467445 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465136    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:36:05.467445 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465138    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:36:05.467445 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465141    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:36:05.467445 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465144    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:36:05.467445 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465146    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:36:05.467445 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465149    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:36:05.467445 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465151    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:36:05.467944 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465155    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:36:05.467944 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465157    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:36:05.467944 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465160    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:36:05.467944 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465163    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:36:05.467944 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465165    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:36:05.467944 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465168    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:36:05.467944 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465170    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:36:05.467944 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465173    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:36:05.467944 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465175    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:36:05.467944 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465177    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:36:05.467944 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465181    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:36:05.467944 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465185    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:36:05.467944 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465189    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:36:05.467944 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465192    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:36:05.467944 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465195    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:36:05.467944 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465198    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:36:05.467944 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465201    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:36:05.467944 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465204    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:36:05.467944 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465207    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465209    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465212    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465214    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465217    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465219    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465222    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465227    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465229    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465232    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465234    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465237    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.465239    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465852    2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465862    2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465868    2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465873    2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465877    2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465881    2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465885    2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465890    2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:36:05.468420 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465893    2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465896    2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465899    2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465903    2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465906    2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465909    2572 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465912    2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465915    2572 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465924    2572 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465928    2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465931    2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465938    2572 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465941    2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465944    2572 flags.go:64] FLAG: --config-dir=""
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465947    2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465950    2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465954    2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465957    2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465962    2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465965    2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465969    2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465972    2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465974    2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465978    2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465981    2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:36:05.469013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465985    2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465988    2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465991    2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465994    2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.465997    2572 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466000    2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466009    2572 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466012    2572 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466015    2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466018    2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466021    2572 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466025    2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466028    2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466031    2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466035    2572 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466037    2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466046    2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466049    2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22
18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466052 2572 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466055 2572 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466058 2572 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466060 2572 flags.go:64] FLAG: --feature-gates="" Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466064 2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466067 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466070 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:36:05.469645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466075 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466078 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466081 2572 flags.go:64] FLAG: --help="false" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466084 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-131-5.ec2.internal" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466087 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466090 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466093 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 
18:36:05.466097 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466100 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466103 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466106 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466109 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466112 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466115 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466118 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466121 2572 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466124 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466127 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466130 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466132 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466136 2572 flags.go:64] FLAG: --lock-file="" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466138 2572 flags.go:64] FLAG: 
--log-cadvisor-usage="false" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466141 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466144 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:36:05.470254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466155 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466158 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466161 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466164 2572 flags.go:64] FLAG: --logging-format="text" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466167 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466170 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466173 2572 flags.go:64] FLAG: --manifest-url="" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466176 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466185 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466188 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466193 2572 flags.go:64] FLAG: --max-pods="110" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466196 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466199 2572 
flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466202 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466205 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466208 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466211 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466214 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466222 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466225 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466228 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466231 2572 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466234 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:36:05.470860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466239 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466242 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466245 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 
18:36:05.466248 2572 flags.go:64] FLAG: --port="10250" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466251 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466254 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-03afbc3d41e92b455" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466257 2572 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466260 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466263 2572 flags.go:64] FLAG: --register-node="true" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466266 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466275 2572 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466279 2572 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466281 2572 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466284 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466287 2572 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466296 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466300 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466304 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466307 2572 flags.go:64] FLAG: 
--rotate-server-certificates="false" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466310 2572 flags.go:64] FLAG: --runonce="false" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466313 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466316 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466319 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466322 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466340 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466344 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:36:05.471451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466347 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466350 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466353 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466356 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466359 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466362 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466365 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:36:05.472087 ip-10-0-131-5 
kubenswrapper[2572]: I0422 18:36:05.466369 2572 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466371 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466377 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466380 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466388 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466396 2572 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466399 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466402 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466405 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466408 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466411 2572 flags.go:64] FLAG: --v="2" Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466415 2572 flags.go:64] FLAG: --version="false" Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466419 2572 flags.go:64] FLAG: --vmodule="" Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466423 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.466427 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:36:05.472087 
ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466527 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466533 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466537 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:36:05.472087 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466540 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:36:05.473090 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466543 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:36:05.473090 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466545 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:36:05.473090 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466548 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:36:05.473090 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466550 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:36:05.473090 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466555 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:36:05.473090 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466557 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:36:05.473090 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466560 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:36:05.473090 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466563 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:36:05.473090 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466565 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting 
Apr 22 18:36:05.473090 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466568 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:36:05.473090 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466570 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:36:05.473090 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466573 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:36:05.473090 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466575 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:36:05.473090 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466578 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:36:05.473090 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466580 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:36:05.473090 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466583 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:36:05.473090 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466585 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:36:05.473090 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466588 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:36:05.473090 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466590 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:36:05.473090 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466593 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:36:05.473623 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466596 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:36:05.473623 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466599 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:36:05.473623 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466603 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:36:05.473623 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466607 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:36:05.473623 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466610 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:36:05.473623 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466613 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:36:05.473623 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466616 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:36:05.473623 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466618 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:36:05.473623 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466621 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:36:05.473623 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466624 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:36:05.473623 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466627 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:36:05.473623 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466630 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:36:05.473623 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466632 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:36:05.473623 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466635 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:36:05.473623 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466638 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:36:05.473623 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466640 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:36:05.473623 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466644 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:36:05.473623 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466647 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:36:05.473623 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466649 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:36:05.474288 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466652 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:36:05.474288 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466654 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:36:05.474288 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466657 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:36:05.474288 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466660 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:36:05.474288 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466662 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:36:05.474288 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466665 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:36:05.474288 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466667 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:36:05.474288 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466670 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:36:05.474288 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466672 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:36:05.474288 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466675 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:36:05.474288 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466677 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:36:05.474288 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466680 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:36:05.474288 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466683 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:36:05.474288 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466685 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:36:05.474288 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466688 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:36:05.474288 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466690 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:36:05.474288 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466693 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:36:05.474288 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466695 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:36:05.474288 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466698 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:36:05.474288 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466701 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:36:05.475206 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466704 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:36:05.475206 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466706 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:36:05.475206 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466710 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:36:05.475206 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466712 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:36:05.475206 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466715 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:36:05.475206 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466717 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:36:05.475206 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466720 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:36:05.475206 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466723 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:36:05.475206 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466725 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:36:05.475206 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466729 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:36:05.475206 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466731 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:36:05.475206 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466734 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:36:05.475206 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466737 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:36:05.475206 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466741 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:36:05.475206 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466744 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:36:05.475206 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466746 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:36:05.475206 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466749 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:36:05.475206 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466752 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:36:05.475206 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466754 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:36:05.476067 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466757 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:36:05.476067 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466759 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:36:05.476067 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466762 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:36:05.476067 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.466765 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:36:05.476067 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.467307 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:36:05.476067 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.475856 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 18:36:05.476067 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.475876 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 18:36:05.476324 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476095 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:36:05.476324 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476121 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:36:05.476324 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476125 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:36:05.476324 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476129 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:36:05.476324 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476132 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:36:05.476324 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476135 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:36:05.476324 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476139 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:36:05.476324 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476142 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:36:05.476324 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476145 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:36:05.476324 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476148 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:36:05.476324 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476151 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:36:05.476324 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476154 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:36:05.476324 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476157 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:36:05.476324 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476160 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:36:05.476324 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476165 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:36:05.476324 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476168 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:36:05.476324 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476171 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:36:05.476324 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476174 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:36:05.476324 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476178 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:36:05.476324 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476181 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:36:05.476932 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476184 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:36:05.476932 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476305 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:36:05.476932 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476311 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:36:05.476932 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476315 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:36:05.476932 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476318 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:36:05.476932 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476321 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:36:05.476932 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476324 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:36:05.476932 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476341 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:36:05.476932 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476344 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:36:05.476932 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476347 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:36:05.476932 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476349 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:36:05.476932 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476352 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:36:05.476932 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476355 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:36:05.476932 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476362 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:36:05.476932 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476365 2572 
feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:36:05.476932 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476368 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:36:05.476932 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476371 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:36:05.476932 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476374 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:36:05.476932 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476377 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:36:05.477417 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476380 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:36:05.477417 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476383 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:36:05.477417 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476386 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:36:05.477417 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476389 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:36:05.477417 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476392 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:36:05.477417 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476394 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:36:05.477417 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476397 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:36:05.477417 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476400 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:36:05.477417 ip-10-0-131-5 
kubenswrapper[2572]: W0422 18:36:05.476403 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:36:05.477417 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476406 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:36:05.477417 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476410 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:36:05.477417 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476412 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:36:05.477417 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476415 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:36:05.477417 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476417 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:36:05.477417 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476420 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:36:05.477417 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476423 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:36:05.477417 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476425 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:36:05.477417 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476428 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:36:05.477417 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476431 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:36:05.477417 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476433 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:36:05.477895 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476436 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 
18:36:05.477895 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476438 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:36:05.477895 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476441 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:36:05.477895 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476443 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:36:05.477895 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476446 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:36:05.477895 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476449 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:36:05.477895 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476453 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:36:05.477895 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476456 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:36:05.477895 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476458 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:36:05.477895 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476461 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:36:05.477895 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476463 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:36:05.477895 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476466 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:36:05.477895 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476470 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:36:05.477895 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476474 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:36:05.477895 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476477 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:36:05.477895 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476481 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:36:05.477895 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476483 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:36:05.477895 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476486 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:36:05.477895 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476489 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:36:05.477895 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476492 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:36:05.478397 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476495 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:36:05.478397 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476497 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:36:05.478397 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476501 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:36:05.478397 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476503 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:36:05.478397 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476506 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:36:05.478397 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476508 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation 
Apr 22 18:36:05.478397 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476512 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:36:05.478397 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.476517 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:36:05.478397 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476620 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:36:05.478397 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476625 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:36:05.478397 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476628 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:36:05.478397 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476631 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:36:05.478397 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476634 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:36:05.478397 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476638 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:36:05.478397 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476640 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:36:05.478792 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476643 2572 feature_gate.go:328] unrecognized feature 
gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:36:05.478792 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476646 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:36:05.478792 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476649 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:36:05.478792 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476652 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:36:05.478792 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476655 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:36:05.478792 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476657 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:36:05.478792 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476660 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:36:05.478792 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476663 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:36:05.478792 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476666 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:36:05.478792 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476668 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:36:05.478792 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476671 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:36:05.478792 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476674 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:36:05.478792 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476676 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:36:05.478792 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476678 2572 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:36:05.478792 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476681 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:36:05.478792 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476684 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:36:05.478792 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476686 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:36:05.478792 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476689 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:36:05.478792 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476691 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:36:05.478792 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476694 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:36:05.479290 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476697 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:36:05.479290 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476699 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:36:05.479290 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476702 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:36:05.479290 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476704 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:36:05.479290 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476707 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:36:05.479290 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476710 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:36:05.479290 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476713 2572 
feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:36:05.479290 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476717 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:36:05.479290 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476720 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:36:05.479290 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476723 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:36:05.479290 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476726 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:36:05.479290 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476728 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:36:05.479290 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476732 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:36:05.479290 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476735 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:36:05.479290 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476737 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:36:05.479290 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476740 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:36:05.479290 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476742 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:36:05.479290 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476745 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:36:05.479290 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476748 2572 feature_gate.go:328] unrecognized feature gate: 
AzureWorkloadIdentity Apr 22 18:36:05.479290 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476750 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:36:05.479831 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476753 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:36:05.479831 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476755 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:36:05.479831 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476758 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:36:05.479831 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476761 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:36:05.479831 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476763 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:36:05.479831 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476766 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:36:05.479831 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476768 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:36:05.479831 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476771 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:36:05.479831 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476774 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:36:05.479831 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476777 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:36:05.479831 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476780 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:36:05.479831 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476784 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:36:05.479831 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476787 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:36:05.479831 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476790 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:36:05.479831 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476793 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:36:05.479831 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476795 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:36:05.479831 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476798 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:36:05.479831 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476800 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:36:05.479831 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476803 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:36:05.480295 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476805 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:36:05.480295 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476808 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:36:05.480295 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476810 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:36:05.480295 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476813 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:36:05.480295 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476816 2572 feature_gate.go:328] unrecognized 
feature gate: AWSClusterHostedDNS Apr 22 18:36:05.480295 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476819 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:36:05.480295 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476821 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:36:05.480295 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476824 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:36:05.480295 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476827 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:36:05.480295 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476829 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:36:05.480295 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476832 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:36:05.480295 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476835 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:36:05.480295 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476837 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:36:05.480295 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476840 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:36:05.480295 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476842 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:36:05.480295 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476845 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:36:05.480295 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476848 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:36:05.480295 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476851 2572 
feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:36:05.480295 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476853 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:36:05.480295 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:05.476856 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:36:05.480810 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.476861 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:36:05.480810 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.478310 2572 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 18:36:05.486003 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.485989 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 18:36:05.487539 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.487528 2572 server.go:1019] "Starting client certificate rotation" Apr 22 18:36:05.487635 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.487619 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:36:05.487670 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.487653 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:36:05.518405 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.518385 2572 dynamic_cafile_content.go:123] "Loaded a 
new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:36:05.521091 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.521076 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:36:05.538387 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.538370 2572 log.go:25] "Validated CRI v1 runtime API" Apr 22 18:36:05.544128 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.544108 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:36:05.545531 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.545519 2572 log.go:25] "Validated CRI v1 image API" Apr 22 18:36:05.546882 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.546862 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 18:36:05.549204 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.549184 2572 fs.go:135] Filesystem UUIDs: map[34e53a16-4585-462f-b85c-00d62e4921df:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 d8c2427e-e711-4cf4-b592-535e9e998252:/dev/nvme0n1p3] Apr 22 18:36:05.549290 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.549204 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 18:36:05.555093 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.554751 2572 manager.go:217] Machine: {Timestamp:2026-04-22 18:36:05.552993028 +0000 UTC m=+0.412161740 CPUVendorID:GenuineIntel 
NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098567 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec25cca36e1151e47d1ac4b261cfd1a0 SystemUUID:ec25cca3-6e11-51e4-7d1a-c4b261cfd1a0 BootID:f82805f3-7cbf-4277-bb5b-084a896ab38a Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:74:ad:78:9f:83 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:74:ad:78:9f:83 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6a:35:94:9d:49:c8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified 
Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 18:36:05.555093 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.555080 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 22 18:36:05.555222 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.555199 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 18:36:05.557783 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.557763 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 18:36:05.557913 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.557786 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-131-5.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:36:05.558460 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.558450 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:36:05.558504 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.558462 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:36:05.558504 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.558487 2572 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:36:05.559276 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.559266 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:36:05.560735 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.560725 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:36:05.560841 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.560832 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:36:05.563500 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.563491 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:36:05.563538 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.563504 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:36:05.563538 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.563515 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:36:05.563538 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.563524 2572 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:36:05.563538 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.563534 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 18:36:05.564861 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.564850 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:36:05.564913 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.564867 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:36:05.567734 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.567718 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:36:05.569073 ip-10-0-131-5 kubenswrapper[2572]: I0422 
18:36:05.569037 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:36:05.571063 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.571048 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:36:05.571063 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.571067 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:36:05.571166 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.571074 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:36:05.571166 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.571080 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:36:05.571166 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.571085 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:36:05.571166 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.571091 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:36:05.571166 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.571098 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:36:05.571166 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.571103 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:36:05.571166 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.571111 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:36:05.571166 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.571117 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:36:05.571166 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.571126 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:36:05.571166 ip-10-0-131-5 kubenswrapper[2572]: 
I0422 18:36:05.571135 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:36:05.573225 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.573214 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:36:05.573225 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.573224 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:36:05.574567 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.574546 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-5.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:36:05.574634 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.574609 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:36:05.576616 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.576602 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:36:05.576676 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.576638 2572 server.go:1295] "Started kubelet" Apr 22 18:36:05.576757 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.576734 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:36:05.576804 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.576729 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:36:05.576804 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.576789 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:36:05.577475 ip-10-0-131-5 systemd[1]: Started Kubernetes 
Kubelet. Apr 22 18:36:05.577830 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.577816 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:36:05.577899 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.577841 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:36:05.583174 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.583157 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:36:05.583174 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.583167 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:36:05.583872 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.583855 2572 factory.go:55] Registering systemd factory Apr 22 18:36:05.583950 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.583879 2572 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:36:05.584013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.583860 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:36:05.584013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.584001 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:36:05.584114 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.584040 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-5.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:36:05.584114 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.583878 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:36:05.584202 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.584168 2572 factory.go:153] Registering CRI-O factory Apr 22 18:36:05.584202 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.584180 2572 
factory.go:223] Registration of the crio container factory successfully Apr 22 18:36:05.584290 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.584243 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found" Apr 22 18:36:05.584362 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.584180 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:36:05.584417 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.584362 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:36:05.584635 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.584616 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:36:05.584696 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.584652 2572 factory.go:103] Registering Raw factory Apr 22 18:36:05.584696 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.584667 2572 manager.go:1196] Started watching for new ooms in manager Apr 22 18:36:05.585130 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.584157 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-5.ec2.internal.18a8c1a492a301c1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-5.ec2.internal,UID:ip-10-0-131-5.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-5.ec2.internal,},FirstTimestamp:2026-04-22 18:36:05.576614337 +0000 UTC m=+0.435783049,LastTimestamp:2026-04-22 18:36:05.576614337 +0000 UTC 
m=+0.435783049,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-5.ec2.internal,}" Apr 22 18:36:05.585738 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.585723 2572 manager.go:319] Starting recovery of all containers Apr 22 18:36:05.585841 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.585822 2572 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:36:05.590049 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.590015 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 18:36:05.590049 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.590020 2572 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-5.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 18:36:05.596912 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.596898 2572 manager.go:324] Recovery completed Apr 22 18:36:05.598065 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.598049 2572 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 22 18:36:05.600857 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.600845 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:05.603302 
ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.603287 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:05.603393 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.603316 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:05.603393 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.603365 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:05.603876 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.603859 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:36:05.603876 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.603875 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:36:05.604002 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.603894 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:36:05.605925 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.605909 2572 policy_none.go:49] "None policy: Start" Apr 22 18:36:05.606001 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.605931 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:36:05.606001 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.605945 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:36:05.606095 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.606005 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-5.ec2.internal.18a8c1a4943a3cf1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-5.ec2.internal,UID:ip-10-0-131-5.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-5.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-5.ec2.internal,},FirstTimestamp:2026-04-22 18:36:05.603302641 +0000 UTC m=+0.462471355,LastTimestamp:2026-04-22 18:36:05.603302641 +0000 UTC m=+0.462471355,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-5.ec2.internal,}" Apr 22 18:36:05.617625 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.617562 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-5.ec2.internal.18a8c1a4943a8b2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-5.ec2.internal,UID:ip-10-0-131-5.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-131-5.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-131-5.ec2.internal,},FirstTimestamp:2026-04-22 18:36:05.603322668 +0000 UTC m=+0.462491380,LastTimestamp:2026-04-22 18:36:05.603322668 +0000 UTC m=+0.462491380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-5.ec2.internal,}" Apr 22 18:36:05.627865 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.627792 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-131-5.ec2.internal.18a8c1a4943b4c40 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-5.ec2.internal,UID:ip-10-0-131-5.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-131-5.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-131-5.ec2.internal,},FirstTimestamp:2026-04-22 18:36:05.603372096 +0000 UTC m=+0.462540811,LastTimestamp:2026-04-22 18:36:05.603372096 +0000 UTC m=+0.462540811,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-5.ec2.internal,}" Apr 22 18:36:05.651907 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.651891 2572 manager.go:341] "Starting Device Plugin manager" Apr 22 18:36:05.652910 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.651933 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:36:05.652910 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.651946 2572 server.go:85] "Starting device plugin registration server" Apr 22 18:36:05.652910 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.652164 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:36:05.652910 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.652175 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:36:05.652910 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.652269 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:36:05.652910 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.652361 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:36:05.652910 ip-10-0-131-5 kubenswrapper[2572]: 
I0422 18:36:05.652370 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:36:05.652910 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.652825 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 18:36:05.652910 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.652854 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-5.ec2.internal\" not found" Apr 22 18:36:05.663080 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.663015 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-5.ec2.internal.18a8c1a4973e3355 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-5.ec2.internal,UID:ip-10-0-131-5.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-131-5.ec2.internal,},FirstTimestamp:2026-04-22 18:36:05.653893973 +0000 UTC m=+0.513062671,LastTimestamp:2026-04-22 18:36:05.653893973 +0000 UTC m=+0.513062671,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-5.ec2.internal,}" Apr 22 18:36:05.666916 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.666901 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6pfl4" Apr 22 18:36:05.675146 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.675118 2572 csr.go:270] "Certificate signing request is issued" 
logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6pfl4" Apr 22 18:36:05.729101 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.729027 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:36:05.730261 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.730245 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:36:05.730343 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.730273 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:36:05.730343 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.730292 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 18:36:05.730343 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.730299 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:36:05.730464 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.730394 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:36:05.740869 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.740845 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:05.752926 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.752912 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:05.753624 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.753610 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:05.753683 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.753639 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:05.753683 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.753650 2572 kubelet_node_status.go:736] 
"Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:05.753683 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.753674 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-5.ec2.internal" Apr 22 18:36:05.763077 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.763062 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-5.ec2.internal" Apr 22 18:36:05.763129 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.763081 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-5.ec2.internal\": node \"ip-10-0-131-5.ec2.internal\" not found" Apr 22 18:36:05.779552 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.779534 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found" Apr 22 18:36:05.830833 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.830806 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal"] Apr 22 18:36:05.830918 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.830905 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:05.833082 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.833067 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:05.833143 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.833099 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:05.833143 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.833110 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" 
event="NodeHasSufficientPID" Apr 22 18:36:05.834211 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.834200 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:05.834862 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.834839 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:05.834862 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.834865 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:05.834986 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.834878 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:05.834986 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.834921 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal" Apr 22 18:36:05.834986 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.834987 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:05.835712 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.835699 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:05.835780 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.835724 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:05.835780 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.835737 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:05.835887 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.835857 2572 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal" Apr 22 18:36:05.835887 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.835878 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:05.836484 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.836468 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:05.836578 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.836496 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:05.836578 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.836508 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:05.862850 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.862832 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-5.ec2.internal\" not found" node="ip-10-0-131-5.ec2.internal" Apr 22 18:36:05.866998 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.866978 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-5.ec2.internal\" not found" node="ip-10-0-131-5.ec2.internal" Apr 22 18:36:05.880028 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.880009 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found" Apr 22 18:36:05.885966 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.885944 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b50e7600b47f76668e274e014c99f3ac-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal\" (UID: \"b50e7600b47f76668e274e014c99f3ac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal"
Apr 22 18:36:05.886058 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.885968 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b50e7600b47f76668e274e014c99f3ac-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal\" (UID: \"b50e7600b47f76668e274e014c99f3ac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal"
Apr 22 18:36:05.886058 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.885991 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/55d83b29c984d704f2c407ca2173be08-config\") pod \"kube-apiserver-proxy-ip-10-0-131-5.ec2.internal\" (UID: \"55d83b29c984d704f2c407ca2173be08\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal"
Apr 22 18:36:05.980694 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:05.980618 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found"
Apr 22 18:36:05.986145 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.986126 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b50e7600b47f76668e274e014c99f3ac-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal\" (UID: \"b50e7600b47f76668e274e014c99f3ac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal"
Apr 22 18:36:05.986198 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.986154 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b50e7600b47f76668e274e014c99f3ac-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal\" (UID: \"b50e7600b47f76668e274e014c99f3ac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal"
Apr 22 18:36:05.986198 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.986171 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/55d83b29c984d704f2c407ca2173be08-config\") pod \"kube-apiserver-proxy-ip-10-0-131-5.ec2.internal\" (UID: \"55d83b29c984d704f2c407ca2173be08\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal"
Apr 22 18:36:05.986264 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.986213 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b50e7600b47f76668e274e014c99f3ac-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal\" (UID: \"b50e7600b47f76668e274e014c99f3ac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal"
Apr 22 18:36:05.986264 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.986259 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b50e7600b47f76668e274e014c99f3ac-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal\" (UID: \"b50e7600b47f76668e274e014c99f3ac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal"
Apr 22 18:36:05.986322 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:05.986292 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/55d83b29c984d704f2c407ca2173be08-config\") pod \"kube-apiserver-proxy-ip-10-0-131-5.ec2.internal\" (UID: \"55d83b29c984d704f2c407ca2173be08\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal"
Apr 22 18:36:06.081552 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:06.081514 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found"
Apr 22 18:36:06.164072 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.164036 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal"
Apr 22 18:36:06.169626 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.169605 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal"
Apr 22 18:36:06.182324 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:06.182303 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found"
Apr 22 18:36:06.282511 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:06.282439 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found"
Apr 22 18:36:06.382917 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:06.382893 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found"
Apr 22 18:36:06.400869 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.400848 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:36:06.484023 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.484000 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal"
Apr 22 18:36:06.486909 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.486895 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 18:36:06.487076 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.487059 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:36:06.487112 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.487057 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:36:06.487112 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:06.487039 2572 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://ab529ce1a2d38443a8d8fd8296197706-bf87a9218e9e9494.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.131.5:59024->13.216.108.216:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal"
Apr 22 18:36:06.487112 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.487101 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal"
Apr 22 18:36:06.503298 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.503277 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:36:06.558845 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.558801 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:36:06.563757 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.563739 2572 apiserver.go:52] "Watching apiserver"
Apr 22 18:36:06.570372 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.570237 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 18:36:06.570574 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.570553 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-5vfs6","kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal","openshift-dns/node-resolver-hjctd","openshift-multus/multus-additional-cni-plugins-zw6cn","openshift-multus/multus-jl675","openshift-ovn-kubernetes/ovnkube-node-vjgnk","kube-system/konnectivity-agent-wlpch","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l","openshift-cluster-node-tuning-operator/tuned-p4kkw","openshift-image-registry/node-ca-tpthn","openshift-multus/network-metrics-daemon-k7crw","openshift-network-diagnostics/network-check-target-5h8ps"]
Apr 22 18:36:06.573137 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.573118 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5vfs6"
Apr 22 18:36:06.574114 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.574094 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hjctd"
Apr 22 18:36:06.574206 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.574178 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zw6cn"
Apr 22 18:36:06.575622 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.575601 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 18:36:06.575792 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.575777 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 18:36:06.575858 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.575835 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:36:06.575858 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.575854 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jl675"
Apr 22 18:36:06.575957 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.575930 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-hbmj6\""
Apr 22 18:36:06.576222 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.576207 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 18:36:06.576535 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.576519 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 18:36:06.576535 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.576526 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 18:36:06.576694 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.576587 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-jcfwz\""
Apr 22 18:36:06.576694 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.576648 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8k8nq\""
Apr 22 18:36:06.576833 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.576818 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 18:36:06.576896 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.576884 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 18:36:06.577374 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.577352 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 18:36:06.577513 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.577382 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.577513 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.577357 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 18:36:06.577901 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.577884 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 18:36:06.578490 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.578464 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-6gsp2\""
Apr 22 18:36:06.578836 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.578822 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wlpch"
Apr 22 18:36:06.579505 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.579481 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 18:36:06.579678 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.579663 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 18:36:06.579790 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.579770 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 18:36:06.579868 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.579773 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l"
Apr 22 18:36:06.580499 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.580484 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 18:36:06.580499 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.580498 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-znz9g\""
Apr 22 18:36:06.580663 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.580523 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 18:36:06.580663 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.580571 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 18:36:06.580928 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.580913 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.581023 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.581006 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 18:36:06.581099 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.581006 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-ptg64\""
Apr 22 18:36:06.581142 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.581008 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 18:36:06.582100 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.582079 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 18:36:06.582437 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.582420 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 18:36:06.582925 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.582899 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 18:36:06.583152 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.583136 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 18:36:06.583230 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.583153 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-lx5wx\""
Apr 22 18:36:06.583306 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.583291 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 18:36:06.586544 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.586528 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:36:06.586672 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.586654 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-57dw5\""
Apr 22 18:36:06.586788 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.586774 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tpthn"
Apr 22 18:36:06.587053 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.586887 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7crw"
Apr 22 18:36:06.587151 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:06.587127 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7crw" podUID="fff77f0b-c2fb-4acb-b894-ce916d7cf9d2"
Apr 22 18:36:06.588496 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.588479 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5h8ps"
Apr 22 18:36:06.588680 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:06.588658 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5h8ps" podUID="516d0b19-b6db-46c2-9865-24e9c2e844fc"
Apr 22 18:36:06.588921 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.588897 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3a24b441-1f95-45b3-b520-483d996f771f-cni-binary-copy\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn"
Apr 22 18:36:06.589006 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.588947 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-sysconfig\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.589006 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.588980 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-systemd\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.589106 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589011 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-etc-kubernetes\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675"
Apr 22 18:36:06.589106 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589046 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d1294e5e-31d1-48a2-8134-4d7b0f658d42-hosts-file\") pod \"node-resolver-hjctd\" (UID: \"d1294e5e-31d1-48a2-8134-4d7b0f658d42\") " pod="openshift-dns/node-resolver-hjctd"
Apr 22 18:36:06.589106 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589076 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-run-netns\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.589234 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589104 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 18:36:06.589234 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589105 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-kubelet\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.589234 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589179 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-sys\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.589234 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589210 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-host\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.589436 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589236 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9a7f054c-e2d0-4250-be22-6160ebb37eec-cni-binary-copy\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675"
Apr 22 18:36:06.589436 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589273 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9a7f054c-e2d0-4250-be22-6160ebb37eec-multus-daemon-config\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675"
Apr 22 18:36:06.589436 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589304 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-log-socket\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.589436 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589352 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ba489902-99d2-4aa8-afc6-aac5da21ebe8-socket-dir\") pod \"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l"
Apr 22 18:36:06.589436 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589371 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-789rh\""
Apr 22 18:36:06.589436 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589383 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-lib-modules\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.589436 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589412 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d8425f70-4f14-4d86-b30e-3abe38269764-serviceca\") pod \"node-ca-tpthn\" (UID: \"d8425f70-4f14-4d86-b30e-3abe38269764\") " pod="openshift-image-registry/node-ca-tpthn"
Apr 22 18:36:06.589745 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589440 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-systemd-units\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.589745 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589474 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/915264b6-6df0-4100-9a03-985c5f546a4b-agent-certs\") pod \"konnectivity-agent-wlpch\" (UID: \"915264b6-6df0-4100-9a03-985c5f546a4b\") " pod="kube-system/konnectivity-agent-wlpch"
Apr 22 18:36:06.589745 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589490 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 18:36:06.589745 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589507 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ba489902-99d2-4aa8-afc6-aac5da21ebe8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l"
Apr 22 18:36:06.589745 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589542 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-tuned\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.589745 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589576 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m9zb\" (UniqueName: \"kubernetes.io/projected/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-kube-api-access-4m9zb\") pod \"network-metrics-daemon-k7crw\" (UID: \"fff77f0b-c2fb-4acb-b894-ce916d7cf9d2\") " pod="openshift-multus/network-metrics-daemon-k7crw"
Apr 22 18:36:06.589745 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589607 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3a24b441-1f95-45b3-b520-483d996f771f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn"
Apr 22 18:36:06.589745 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589660 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-run-openvswitch\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.589745 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589692 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.589745 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589724 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-ovnkube-script-lib\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.590177 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589755 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d87vg\" (UniqueName: \"kubernetes.io/projected/ba489902-99d2-4aa8-afc6-aac5da21ebe8-kube-api-access-d87vg\") pod \"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l"
Apr 22 18:36:06.590177 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589790 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pqfw\" (UniqueName: \"kubernetes.io/projected/83267a92-55fb-45ae-8856-cfb92fa1ca05-kube-api-access-4pqfw\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.590177 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589821 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-slash\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.590177 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589846 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 18:36:06.590177 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589853 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-cnibin\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675"
Apr 22 18:36:06.590177 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589890 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3a24b441-1f95-45b3-b520-483d996f771f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn"
Apr 22 18:36:06.590177 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.589923 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvdg7\" (UniqueName: \"kubernetes.io/projected/3a24b441-1f95-45b3-b520-483d996f771f-kube-api-access-xvdg7\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn"
Apr 22 18:36:06.590177 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.590075 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-etc-openvswitch\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.590177 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.590114 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-run-ovn-kubernetes\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.590177 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.590165 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-cni-bin\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.590678 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.590208 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-env-overrides\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.590678 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.590296 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-sysctl-conf\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.590678 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.590358 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-var-lib-kubelet\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.590678 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.590399 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-hostroot\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675"
Apr 22 18:36:06.590854 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.590792 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d8425f70-4f14-4d86-b30e-3abe38269764-host\") pod \"node-ca-tpthn\" (UID: \"d8425f70-4f14-4d86-b30e-3abe38269764\") " pod="openshift-image-registry/node-ca-tpthn"
Apr 22 18:36:06.590854 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.590834 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3a24b441-1f95-45b3-b520-483d996f771f-os-release\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn"
Apr 22 18:36:06.590942 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.590884 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/915264b6-6df0-4100-9a03-985c5f546a4b-konnectivity-ca\") pod \"konnectivity-agent-wlpch\" (UID: \"915264b6-6df0-4100-9a03-985c5f546a4b\") " pod="kube-system/konnectivity-agent-wlpch"
Apr 22 18:36:06.590986 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.590942 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06291473-0b0d-41e0-99f1-3d887d31c55e-host-slash\") pod \"iptables-alerter-5vfs6\" (UID: \"06291473-0b0d-41e0-99f1-3d887d31c55e\") " pod="openshift-network-operator/iptables-alerter-5vfs6"
Apr 22 18:36:06.591035 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.590990 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgkhd\" (UniqueName: \"kubernetes.io/projected/06291473-0b0d-41e0-99f1-3d887d31c55e-kube-api-access-pgkhd\") pod \"iptables-alerter-5vfs6\" (UID: \"06291473-0b0d-41e0-99f1-3d887d31c55e\") " pod="openshift-network-operator/iptables-alerter-5vfs6"
Apr 22 18:36:06.591035 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.591016 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3a24b441-1f95-45b3-b520-483d996f771f-system-cni-dir\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn"
Apr 22 18:36:06.591152 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.591045 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-kubernetes\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.591152 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.591073 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-run\") pod \"tuned-p4kkw\"
(UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.591152 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.591120 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-multus-cni-dir\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.591909 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.591194 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-host-var-lib-cni-multus\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.591909 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.591242 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-var-lib-openvswitch\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.591909 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.591273 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-cni-netd\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.591909 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.591302 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/3a24b441-1f95-45b3-b520-483d996f771f-cnibin\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn" Apr 22 18:36:06.591909 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.591366 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-system-cni-dir\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.591909 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.591413 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-os-release\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.591909 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.591458 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-multus-conf-dir\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.591909 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.591558 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rtj9\" (UniqueName: \"kubernetes.io/projected/9a7f054c-e2d0-4250-be22-6160ebb37eec-kube-api-access-8rtj9\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.591909 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.591602 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-run-ovn\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.591909 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.591632 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjdhl\" (UniqueName: \"kubernetes.io/projected/d1294e5e-31d1-48a2-8134-4d7b0f658d42-kube-api-access-qjdhl\") pod \"node-resolver-hjctd\" (UID: \"d1294e5e-31d1-48a2-8134-4d7b0f658d42\") " pod="openshift-dns/node-resolver-hjctd" Apr 22 18:36:06.591909 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.591673 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-node-log\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.591909 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.591766 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-ovnkube-config\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.591909 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.591798 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-ovn-node-metrics-cert\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.591909 ip-10-0-131-5 kubenswrapper[2572]: I0422 
18:36:06.591862 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ba489902-99d2-4aa8-afc6-aac5da21ebe8-sys-fs\") pod \"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l" Apr 22 18:36:06.592624 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.591915 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-modprobe-d\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.592624 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.591975 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/06291473-0b0d-41e0-99f1-3d887d31c55e-iptables-alerter-script\") pod \"iptables-alerter-5vfs6\" (UID: \"06291473-0b0d-41e0-99f1-3d887d31c55e\") " pod="openshift-network-operator/iptables-alerter-5vfs6" Apr 22 18:36:06.592624 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.592010 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-multus-socket-dir-parent\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.592624 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.592037 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-host-run-multus-certs\") pod \"multus-jl675\" (UID: 
\"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.592624 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.592062 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ba489902-99d2-4aa8-afc6-aac5da21ebe8-registration-dir\") pod \"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l" Apr 22 18:36:06.592624 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.592089 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ba489902-99d2-4aa8-afc6-aac5da21ebe8-device-dir\") pod \"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l" Apr 22 18:36:06.592624 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.592117 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ba489902-99d2-4aa8-afc6-aac5da21ebe8-etc-selinux\") pod \"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l" Apr 22 18:36:06.592624 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.592143 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-host-run-k8s-cni-cncf-io\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.592624 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.592171 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-98m7l\" (UniqueName: \"kubernetes.io/projected/d8425f70-4f14-4d86-b30e-3abe38269764-kube-api-access-98m7l\") pod \"node-ca-tpthn\" (UID: \"d8425f70-4f14-4d86-b30e-3abe38269764\") " pod="openshift-image-registry/node-ca-tpthn" Apr 22 18:36:06.592624 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.592197 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3a24b441-1f95-45b3-b520-483d996f771f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn" Apr 22 18:36:06.592624 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.592246 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/83267a92-55fb-45ae-8856-cfb92fa1ca05-tmp\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.592624 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.592299 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-host-run-netns\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.592624 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.592347 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-host-var-lib-kubelet\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.592624 ip-10-0-131-5 
kubenswrapper[2572]: I0422 18:36:06.592377 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-run-systemd\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.592624 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.592408 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g7m9\" (UniqueName: \"kubernetes.io/projected/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-kube-api-access-2g7m9\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.592624 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.592457 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-sysctl-d\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.593355 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.592498 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-host-var-lib-cni-bin\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.593355 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.592548 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs\") pod \"network-metrics-daemon-k7crw\" (UID: 
\"fff77f0b-c2fb-4acb-b894-ce916d7cf9d2\") " pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:06.593355 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.592632 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d1294e5e-31d1-48a2-8134-4d7b0f658d42-tmp-dir\") pod \"node-resolver-hjctd\" (UID: \"d1294e5e-31d1-48a2-8134-4d7b0f658d42\") " pod="openshift-dns/node-resolver-hjctd" Apr 22 18:36:06.638557 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.638536 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:36:06.651675 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.651653 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-dhbvm" Apr 22 18:36:06.658704 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.658687 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-dhbvm" Apr 22 18:36:06.671619 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:06.671595 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55d83b29c984d704f2c407ca2173be08.slice/crio-f2cf8b6f1d2d5ffd80cc9ecde8edb2110ea6b3d247af22053808269dd06bd535 WatchSource:0}: Error finding container f2cf8b6f1d2d5ffd80cc9ecde8edb2110ea6b3d247af22053808269dd06bd535: Status 404 returned error can't find the container with id f2cf8b6f1d2d5ffd80cc9ecde8edb2110ea6b3d247af22053808269dd06bd535 Apr 22 18:36:06.671983 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:06.671956 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb50e7600b47f76668e274e014c99f3ac.slice/crio-06f84f2a8ee49366fa5447679422383991b700ba4039fa80e8584426a6f35082 WatchSource:0}: Error finding container 06f84f2a8ee49366fa5447679422383991b700ba4039fa80e8584426a6f35082: Status 404 returned error can't find the container with id 06f84f2a8ee49366fa5447679422383991b700ba4039fa80e8584426a6f35082 Apr 22 18:36:06.676638 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.676618 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:36:06.676895 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.676868 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:31:05 +0000 UTC" deadline="2027-11-01 14:14:06.174775279 +0000 UTC" Apr 22 18:36:06.676947 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.676895 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13387h37m59.497882425s" Apr 22 18:36:06.684654 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.684640 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:36:06.693363 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693344 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-kubelet\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.693455 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693372 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-sys\") pod \"tuned-p4kkw\" (UID: 
\"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.693455 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693387 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-host\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.693455 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693403 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9a7f054c-e2d0-4250-be22-6160ebb37eec-cni-binary-copy\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.693455 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693417 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9a7f054c-e2d0-4250-be22-6160ebb37eec-multus-daemon-config\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.693645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693454 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-sys\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.693645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693473 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-host\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " 
pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.693645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693492 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-log-socket\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.693645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693525 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ba489902-99d2-4aa8-afc6-aac5da21ebe8-socket-dir\") pod \"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l" Apr 22 18:36:06.693645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693541 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-lib-modules\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.693645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693547 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-log-socket\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.693645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693555 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d8425f70-4f14-4d86-b30e-3abe38269764-serviceca\") pod \"node-ca-tpthn\" (UID: \"d8425f70-4f14-4d86-b30e-3abe38269764\") " 
pod="openshift-image-registry/node-ca-tpthn" Apr 22 18:36:06.693645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693455 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-kubelet\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.693645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693594 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-systemd-units\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.693645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693619 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/915264b6-6df0-4100-9a03-985c5f546a4b-agent-certs\") pod \"konnectivity-agent-wlpch\" (UID: \"915264b6-6df0-4100-9a03-985c5f546a4b\") " pod="kube-system/konnectivity-agent-wlpch" Apr 22 18:36:06.693645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693640 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ba489902-99d2-4aa8-afc6-aac5da21ebe8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l" Apr 22 18:36:06.694140 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693671 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-tuned\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " 
pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.694140 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693676 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-lib-modules\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.694140 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693698 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4m9zb\" (UniqueName: \"kubernetes.io/projected/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-kube-api-access-4m9zb\") pod \"network-metrics-daemon-k7crw\" (UID: \"fff77f0b-c2fb-4acb-b894-ce916d7cf9d2\") " pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:06.694140 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693721 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ba489902-99d2-4aa8-afc6-aac5da21ebe8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l" Apr 22 18:36:06.694140 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693745 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3a24b441-1f95-45b3-b520-483d996f771f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn" Apr 22 18:36:06.694140 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693670 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ba489902-99d2-4aa8-afc6-aac5da21ebe8-socket-dir\") pod 
\"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l" Apr 22 18:36:06.694140 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693771 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-run-openvswitch\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.694140 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693796 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.694140 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693822 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-ovnkube-script-lib\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.694140 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693846 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d87vg\" (UniqueName: \"kubernetes.io/projected/ba489902-99d2-4aa8-afc6-aac5da21ebe8-kube-api-access-d87vg\") pod \"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l" Apr 22 18:36:06.694140 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693870 2572 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3a24b441-1f95-45b3-b520-483d996f771f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn" Apr 22 18:36:06.694140 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693871 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pqfw\" (UniqueName: \"kubernetes.io/projected/83267a92-55fb-45ae-8856-cfb92fa1ca05-kube-api-access-4pqfw\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.694140 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693696 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-systemd-units\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.694140 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693908 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-slash\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.694140 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693945 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-slash\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.694140 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693950 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-cnibin\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.694140 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693960 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d8425f70-4f14-4d86-b30e-3abe38269764-serviceca\") pod \"node-ca-tpthn\" (UID: \"d8425f70-4f14-4d86-b30e-3abe38269764\") " pod="openshift-image-registry/node-ca-tpthn" Apr 22 18:36:06.694982 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693979 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3a24b441-1f95-45b3-b520-483d996f771f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn" Apr 22 18:36:06.694982 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694004 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9a7f054c-e2d0-4250-be22-6160ebb37eec-cni-binary-copy\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.694982 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694013 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvdg7\" (UniqueName: \"kubernetes.io/projected/3a24b441-1f95-45b3-b520-483d996f771f-kube-api-access-xvdg7\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn" Apr 22 18:36:06.694982 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693973 2572 swap_util.go:74] "error creating 
dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:36:06.694982 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694052 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-etc-openvswitch\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.694982 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694058 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-run-openvswitch\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.694982 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694064 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-cnibin\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.694982 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.693982 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.694982 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694080 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-run-ovn-kubernetes\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.694982 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694111 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-run-ovn-kubernetes\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.694982 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694113 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-cni-bin\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.694982 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694160 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-env-overrides\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.694982 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694189 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-sysctl-conf\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.694982 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694215 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-var-lib-kubelet\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.694982 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694239 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-hostroot\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.694982 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694228 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-cni-bin\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.694982 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694001 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9a7f054c-e2d0-4250-be22-6160ebb37eec-multus-daemon-config\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.694982 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694311 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-var-lib-kubelet\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.695744 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694392 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/d8425f70-4f14-4d86-b30e-3abe38269764-host\") pod \"node-ca-tpthn\" (UID: \"d8425f70-4f14-4d86-b30e-3abe38269764\") " pod="openshift-image-registry/node-ca-tpthn" Apr 22 18:36:06.695744 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694393 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-sysctl-conf\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.695744 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694434 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-hostroot\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.695744 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694423 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3a24b441-1f95-45b3-b520-483d996f771f-os-release\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn" Apr 22 18:36:06.695744 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694471 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d8425f70-4f14-4d86-b30e-3abe38269764-host\") pod \"node-ca-tpthn\" (UID: \"d8425f70-4f14-4d86-b30e-3abe38269764\") " pod="openshift-image-registry/node-ca-tpthn" Apr 22 18:36:06.695744 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694466 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/915264b6-6df0-4100-9a03-985c5f546a4b-konnectivity-ca\") pod \"konnectivity-agent-wlpch\" (UID: \"915264b6-6df0-4100-9a03-985c5f546a4b\") " pod="kube-system/konnectivity-agent-wlpch" Apr 22 18:36:06.695744 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694528 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06291473-0b0d-41e0-99f1-3d887d31c55e-host-slash\") pod \"iptables-alerter-5vfs6\" (UID: \"06291473-0b0d-41e0-99f1-3d887d31c55e\") " pod="openshift-network-operator/iptables-alerter-5vfs6" Apr 22 18:36:06.695744 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgkhd\" (UniqueName: \"kubernetes.io/projected/06291473-0b0d-41e0-99f1-3d887d31c55e-kube-api-access-pgkhd\") pod \"iptables-alerter-5vfs6\" (UID: \"06291473-0b0d-41e0-99f1-3d887d31c55e\") " pod="openshift-network-operator/iptables-alerter-5vfs6" Apr 22 18:36:06.695744 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694603 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3a24b441-1f95-45b3-b520-483d996f771f-system-cni-dir\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn" Apr 22 18:36:06.695744 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694606 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-ovnkube-script-lib\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.695744 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694624 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-kubernetes\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.695744 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694629 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3a24b441-1f95-45b3-b520-483d996f771f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn" Apr 22 18:36:06.695744 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694645 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-run\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.695744 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694681 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-env-overrides\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.695744 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694689 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-multus-cni-dir\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.695744 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694687 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-run\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.695744 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694711 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3a24b441-1f95-45b3-b520-483d996f771f-system-cni-dir\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn" Apr 22 18:36:06.695744 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694718 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-host-var-lib-cni-multus\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.696585 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694740 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-etc-openvswitch\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.696585 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694744 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-var-lib-openvswitch\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.696585 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694752 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-kubernetes\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.696585 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694778 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-cni-netd\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.696585 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694784 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-var-lib-openvswitch\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.696585 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694805 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3a24b441-1f95-45b3-b520-483d996f771f-cnibin\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn" Apr 22 18:36:06.696585 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694816 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06291473-0b0d-41e0-99f1-3d887d31c55e-host-slash\") pod \"iptables-alerter-5vfs6\" (UID: \"06291473-0b0d-41e0-99f1-3d887d31c55e\") " pod="openshift-network-operator/iptables-alerter-5vfs6" Apr 22 18:36:06.696585 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694840 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-multus-cni-dir\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.696585 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694850 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-cni-netd\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.696585 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694856 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-host-var-lib-cni-multus\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.696585 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694871 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3a24b441-1f95-45b3-b520-483d996f771f-os-release\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn" Apr 22 18:36:06.696585 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694873 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-system-cni-dir\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.696585 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694910 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-os-release\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.696585 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694912 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-system-cni-dir\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.696585 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694919 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3a24b441-1f95-45b3-b520-483d996f771f-cnibin\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn" Apr 22 18:36:06.696585 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694931 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-multus-conf-dir\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.696585 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694933 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/915264b6-6df0-4100-9a03-985c5f546a4b-konnectivity-ca\") pod \"konnectivity-agent-wlpch\" (UID: \"915264b6-6df0-4100-9a03-985c5f546a4b\") " pod="kube-system/konnectivity-agent-wlpch" Apr 22 18:36:06.696585 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694954 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8rtj9\" (UniqueName: \"kubernetes.io/projected/9a7f054c-e2d0-4250-be22-6160ebb37eec-kube-api-access-8rtj9\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.697446 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694963 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-multus-conf-dir\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.697446 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694965 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-os-release\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.697446 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.694983 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-run-ovn\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.697446 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695022 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-run-ovn\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.697446 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695091 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjdhl\" (UniqueName: 
\"kubernetes.io/projected/d1294e5e-31d1-48a2-8134-4d7b0f658d42-kube-api-access-qjdhl\") pod \"node-resolver-hjctd\" (UID: \"d1294e5e-31d1-48a2-8134-4d7b0f658d42\") " pod="openshift-dns/node-resolver-hjctd" Apr 22 18:36:06.697446 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695123 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-node-log\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.697446 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695146 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-ovnkube-config\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.697446 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695170 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-ovn-node-metrics-cert\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.697446 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695194 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ba489902-99d2-4aa8-afc6-aac5da21ebe8-sys-fs\") pod \"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l" Apr 22 18:36:06.697446 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695207 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-node-log\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:06.697446 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695220 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-modprobe-d\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.697446 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695250 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/06291473-0b0d-41e0-99f1-3d887d31c55e-iptables-alerter-script\") pod \"iptables-alerter-5vfs6\" (UID: \"06291473-0b0d-41e0-99f1-3d887d31c55e\") " pod="openshift-network-operator/iptables-alerter-5vfs6" Apr 22 18:36:06.697446 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695272 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-multus-socket-dir-parent\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.697446 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695298 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-host-run-multus-certs\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.697446 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695318 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" 
(UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-modprobe-d\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" Apr 22 18:36:06.697446 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695348 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ba489902-99d2-4aa8-afc6-aac5da21ebe8-registration-dir\") pod \"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l" Apr 22 18:36:06.697446 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695376 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ba489902-99d2-4aa8-afc6-aac5da21ebe8-device-dir\") pod \"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l" Apr 22 18:36:06.697974 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695393 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ba489902-99d2-4aa8-afc6-aac5da21ebe8-sys-fs\") pod \"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l" Apr 22 18:36:06.697974 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695405 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ba489902-99d2-4aa8-afc6-aac5da21ebe8-etc-selinux\") pod \"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l" Apr 22 18:36:06.697974 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695435 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lbrq\" (UniqueName: \"kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq\") pod \"network-check-target-5h8ps\" (UID: \"516d0b19-b6db-46c2-9865-24e9c2e844fc\") " pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:06.697974 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695441 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-host-run-multus-certs\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.697974 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695473 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-host-run-k8s-cni-cncf-io\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675" Apr 22 18:36:06.697974 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695513 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ba489902-99d2-4aa8-afc6-aac5da21ebe8-registration-dir\") pod \"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l" Apr 22 18:36:06.697974 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695549 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98m7l\" (UniqueName: \"kubernetes.io/projected/d8425f70-4f14-4d86-b30e-3abe38269764-kube-api-access-98m7l\") pod \"node-ca-tpthn\" (UID: \"d8425f70-4f14-4d86-b30e-3abe38269764\") " pod="openshift-image-registry/node-ca-tpthn" Apr 22 
18:36:06.697974 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695571 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3a24b441-1f95-45b3-b520-483d996f771f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn"
Apr 22 18:36:06.697974 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695593 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/83267a92-55fb-45ae-8856-cfb92fa1ca05-tmp\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.697974 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695617 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ba489902-99d2-4aa8-afc6-aac5da21ebe8-etc-selinux\") pod \"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l"
Apr 22 18:36:06.697974 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695613 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-host-run-netns\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675"
Apr 22 18:36:06.697974 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695649 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-ovnkube-config\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.697974 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695657 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-host-var-lib-kubelet\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675"
Apr 22 18:36:06.697974 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695655 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-host-run-netns\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675"
Apr 22 18:36:06.697974 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695685 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-run-systemd\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.697974 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695709 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-host-run-k8s-cni-cncf-io\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675"
Apr 22 18:36:06.697974 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695712 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2g7m9\" (UniqueName: \"kubernetes.io/projected/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-kube-api-access-2g7m9\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.698487 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695742 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-sysctl-d\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.698487 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695768 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-host-var-lib-cni-bin\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675"
Apr 22 18:36:06.698487 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695794 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs\") pod \"network-metrics-daemon-k7crw\" (UID: \"fff77f0b-c2fb-4acb-b894-ce916d7cf9d2\") " pod="openshift-multus/network-metrics-daemon-k7crw"
Apr 22 18:36:06.698487 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695817 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d1294e5e-31d1-48a2-8134-4d7b0f658d42-tmp-dir\") pod \"node-resolver-hjctd\" (UID: \"d1294e5e-31d1-48a2-8134-4d7b0f658d42\") " pod="openshift-dns/node-resolver-hjctd"
Apr 22 18:36:06.698487 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695842 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3a24b441-1f95-45b3-b520-483d996f771f-cni-binary-copy\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn"
Apr 22 18:36:06.698487 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695866 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-sysconfig\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.698487 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695899 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-systemd\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.698487 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695924 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-host-var-lib-kubelet\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675"
Apr 22 18:36:06.698487 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695925 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-etc-kubernetes\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675"
Apr 22 18:36:06.698487 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695968 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-run-systemd\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.698487 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695965 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d1294e5e-31d1-48a2-8134-4d7b0f658d42-hosts-file\") pod \"node-resolver-hjctd\" (UID: \"d1294e5e-31d1-48a2-8134-4d7b0f658d42\") " pod="openshift-dns/node-resolver-hjctd"
Apr 22 18:36:06.698487 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.696000 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-run-netns\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.698487 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.696060 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-host-run-netns\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.698487 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695717 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-multus-socket-dir-parent\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675"
Apr 22 18:36:06.698487 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.696237 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d1294e5e-31d1-48a2-8134-4d7b0f658d42-tmp-dir\") pod \"node-resolver-hjctd\" (UID: \"d1294e5e-31d1-48a2-8134-4d7b0f658d42\") " pod="openshift-dns/node-resolver-hjctd"
Apr 22 18:36:06.698487 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.696374 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-sysctl-d\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.698487 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.696413 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-host-var-lib-cni-bin\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675"
Apr 22 18:36:06.698487 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.696442 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/06291473-0b0d-41e0-99f1-3d887d31c55e-iptables-alerter-script\") pod \"iptables-alerter-5vfs6\" (UID: \"06291473-0b0d-41e0-99f1-3d887d31c55e\") " pod="openshift-network-operator/iptables-alerter-5vfs6"
Apr 22 18:36:06.698958 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:06.696495 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:06.698958 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.696552 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3a24b441-1f95-45b3-b520-483d996f771f-cni-binary-copy\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn"
Apr 22 18:36:06.698958 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:06.696576 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs podName:fff77f0b-c2fb-4acb-b894-ce916d7cf9d2 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:07.19653114 +0000 UTC m=+2.055699849 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs") pod "network-metrics-daemon-k7crw" (UID: "fff77f0b-c2fb-4acb-b894-ce916d7cf9d2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:06.698958 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.696611 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-sysconfig\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.698958 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.696653 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-systemd\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.698958 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.696660 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3a24b441-1f95-45b3-b520-483d996f771f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn"
Apr 22 18:36:06.698958 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.696686 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a7f054c-e2d0-4250-be22-6160ebb37eec-etc-kubernetes\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675"
Apr 22 18:36:06.698958 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.695572 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ba489902-99d2-4aa8-afc6-aac5da21ebe8-device-dir\") pod \"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l"
Apr 22 18:36:06.698958 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.696740 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d1294e5e-31d1-48a2-8134-4d7b0f658d42-hosts-file\") pod \"node-resolver-hjctd\" (UID: \"d1294e5e-31d1-48a2-8134-4d7b0f658d42\") " pod="openshift-dns/node-resolver-hjctd"
Apr 22 18:36:06.698958 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.697190 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/83267a92-55fb-45ae-8856-cfb92fa1ca05-etc-tuned\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.698958 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.697257 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/915264b6-6df0-4100-9a03-985c5f546a4b-agent-certs\") pod \"konnectivity-agent-wlpch\" (UID: \"915264b6-6df0-4100-9a03-985c5f546a4b\") " pod="kube-system/konnectivity-agent-wlpch"
Apr 22 18:36:06.698958 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.697815 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-ovn-node-metrics-cert\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.698958 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.698198 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/83267a92-55fb-45ae-8856-cfb92fa1ca05-tmp\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.702815 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.702785 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pqfw\" (UniqueName: \"kubernetes.io/projected/83267a92-55fb-45ae-8856-cfb92fa1ca05-kube-api-access-4pqfw\") pod \"tuned-p4kkw\" (UID: \"83267a92-55fb-45ae-8856-cfb92fa1ca05\") " pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.705055 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.704391 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d87vg\" (UniqueName: \"kubernetes.io/projected/ba489902-99d2-4aa8-afc6-aac5da21ebe8-kube-api-access-d87vg\") pod \"aws-ebs-csi-driver-node-hq44l\" (UID: \"ba489902-99d2-4aa8-afc6-aac5da21ebe8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l"
Apr 22 18:36:06.705055 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.704787 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgkhd\" (UniqueName: \"kubernetes.io/projected/06291473-0b0d-41e0-99f1-3d887d31c55e-kube-api-access-pgkhd\") pod \"iptables-alerter-5vfs6\" (UID: \"06291473-0b0d-41e0-99f1-3d887d31c55e\") " pod="openshift-network-operator/iptables-alerter-5vfs6"
Apr 22 18:36:06.705055 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.705002 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g7m9\" (UniqueName: \"kubernetes.io/projected/62a2a2c8-4324-4276-a6c1-57c3f81c4b5c-kube-api-access-2g7m9\") pod \"ovnkube-node-vjgnk\" (UID: \"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.706551 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.705504 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjdhl\" (UniqueName: \"kubernetes.io/projected/d1294e5e-31d1-48a2-8134-4d7b0f658d42-kube-api-access-qjdhl\") pod \"node-resolver-hjctd\" (UID: \"d1294e5e-31d1-48a2-8134-4d7b0f658d42\") " pod="openshift-dns/node-resolver-hjctd"
Apr 22 18:36:06.706551 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.705767 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m9zb\" (UniqueName: \"kubernetes.io/projected/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-kube-api-access-4m9zb\") pod \"network-metrics-daemon-k7crw\" (UID: \"fff77f0b-c2fb-4acb-b894-ce916d7cf9d2\") " pod="openshift-multus/network-metrics-daemon-k7crw"
Apr 22 18:36:06.707415 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.707396 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rtj9\" (UniqueName: \"kubernetes.io/projected/9a7f054c-e2d0-4250-be22-6160ebb37eec-kube-api-access-8rtj9\") pod \"multus-jl675\" (UID: \"9a7f054c-e2d0-4250-be22-6160ebb37eec\") " pod="openshift-multus/multus-jl675"
Apr 22 18:36:06.708066 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.708045 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvdg7\" (UniqueName: \"kubernetes.io/projected/3a24b441-1f95-45b3-b520-483d996f771f-kube-api-access-xvdg7\") pod \"multus-additional-cni-plugins-zw6cn\" (UID: \"3a24b441-1f95-45b3-b520-483d996f771f\") " pod="openshift-multus/multus-additional-cni-plugins-zw6cn"
Apr 22 18:36:06.708375 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.708359 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98m7l\" (UniqueName: \"kubernetes.io/projected/d8425f70-4f14-4d86-b30e-3abe38269764-kube-api-access-98m7l\") pod \"node-ca-tpthn\" (UID: \"d8425f70-4f14-4d86-b30e-3abe38269764\") " pod="openshift-image-registry/node-ca-tpthn"
Apr 22 18:36:06.732761 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.732727 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal" event={"ID":"55d83b29c984d704f2c407ca2173be08","Type":"ContainerStarted","Data":"f2cf8b6f1d2d5ffd80cc9ecde8edb2110ea6b3d247af22053808269dd06bd535"}
Apr 22 18:36:06.733616 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.733591 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal" event={"ID":"b50e7600b47f76668e274e014c99f3ac","Type":"ContainerStarted","Data":"06f84f2a8ee49366fa5447679422383991b700ba4039fa80e8584426a6f35082"}
Apr 22 18:36:06.796500 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.796473 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lbrq\" (UniqueName: \"kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq\") pod \"network-check-target-5h8ps\" (UID: \"516d0b19-b6db-46c2-9865-24e9c2e844fc\") " pod="openshift-network-diagnostics/network-check-target-5h8ps"
Apr 22 18:36:06.802897 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:06.802876 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:36:06.802897 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:06.802897 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:36:06.803000 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:06.802908 2572 projected.go:194] Error preparing data for projected volume kube-api-access-7lbrq for pod openshift-network-diagnostics/network-check-target-5h8ps: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:36:06.803000 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:06.802951 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq podName:516d0b19-b6db-46c2-9865-24e9c2e844fc nodeName:}" failed. No retries permitted until 2026-04-22 18:36:07.302939036 +0000 UTC m=+2.162107734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7lbrq" (UniqueName: "kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq") pod "network-check-target-5h8ps" (UID: "516d0b19-b6db-46c2-9865-24e9c2e844fc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:36:06.904238 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.904219 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5vfs6"
Apr 22 18:36:06.910189 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:06.910170 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06291473_0b0d_41e0_99f1_3d887d31c55e.slice/crio-d66eb305770ab111d77dd09647210962a7ce527bd6b723182a23ad24c3a17340 WatchSource:0}: Error finding container d66eb305770ab111d77dd09647210962a7ce527bd6b723182a23ad24c3a17340: Status 404 returned error can't find the container with id d66eb305770ab111d77dd09647210962a7ce527bd6b723182a23ad24c3a17340
Apr 22 18:36:06.923090 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.923068 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hjctd"
Apr 22 18:36:06.929153 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:06.929132 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1294e5e_31d1_48a2_8134_4d7b0f658d42.slice/crio-8c86d966f1f3c1aa2fc2ee842164b05aee240e395972e655f2ea14c6239c6d44 WatchSource:0}: Error finding container 8c86d966f1f3c1aa2fc2ee842164b05aee240e395972e655f2ea14c6239c6d44: Status 404 returned error can't find the container with id 8c86d966f1f3c1aa2fc2ee842164b05aee240e395972e655f2ea14c6239c6d44
Apr 22 18:36:06.938201 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.938187 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zw6cn"
Apr 22 18:36:06.942914 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.942896 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:36:06.943628 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:06.943610 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a24b441_1f95_45b3_b520_483d996f771f.slice/crio-6ef5431d996639338b905401d0ef41701ba0cbb7699bd0dd305d150111af3878 WatchSource:0}: Error finding container 6ef5431d996639338b905401d0ef41701ba0cbb7699bd0dd305d150111af3878: Status 404 returned error can't find the container with id 6ef5431d996639338b905401d0ef41701ba0cbb7699bd0dd305d150111af3878
Apr 22 18:36:06.950499 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.950483 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jl675"
Apr 22 18:36:06.956072 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.956056 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk"
Apr 22 18:36:06.958022 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:06.957990 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a7f054c_e2d0_4250_be22_6160ebb37eec.slice/crio-3eb028fd3c1387bedb43568ecef0aa472ef2d73929ed058ebb74c6e54edad838 WatchSource:0}: Error finding container 3eb028fd3c1387bedb43568ecef0aa472ef2d73929ed058ebb74c6e54edad838: Status 404 returned error can't find the container with id 3eb028fd3c1387bedb43568ecef0aa472ef2d73929ed058ebb74c6e54edad838
Apr 22 18:36:06.964413 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:06.964393 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62a2a2c8_4324_4276_a6c1_57c3f81c4b5c.slice/crio-e4b245ea73e6108921114b87ff005e3b39d4bcd338cdd9a3735d627ae218775d WatchSource:0}: Error finding container e4b245ea73e6108921114b87ff005e3b39d4bcd338cdd9a3735d627ae218775d: Status 404 returned error can't find the container with id e4b245ea73e6108921114b87ff005e3b39d4bcd338cdd9a3735d627ae218775d
Apr 22 18:36:06.975042 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.975022 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wlpch"
Apr 22 18:36:06.979610 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.979590 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l"
Apr 22 18:36:06.981387 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:06.981361 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod915264b6_6df0_4100_9a03_985c5f546a4b.slice/crio-37c419de24b423467477f6a4b10dcf1143b078f6f6b44bd706b8cfb0426f0094 WatchSource:0}: Error finding container 37c419de24b423467477f6a4b10dcf1143b078f6f6b44bd706b8cfb0426f0094: Status 404 returned error can't find the container with id 37c419de24b423467477f6a4b10dcf1143b078f6f6b44bd706b8cfb0426f0094
Apr 22 18:36:06.985850 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.985831 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-p4kkw"
Apr 22 18:36:06.987002 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:06.986982 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba489902_99d2_4aa8_afc6_aac5da21ebe8.slice/crio-5695146632ea3518b9ccb69a52593d5060f809f3208a10b0cc25665d0da5a411 WatchSource:0}: Error finding container 5695146632ea3518b9ccb69a52593d5060f809f3208a10b0cc25665d0da5a411: Status 404 returned error can't find the container with id 5695146632ea3518b9ccb69a52593d5060f809f3208a10b0cc25665d0da5a411
Apr 22 18:36:06.989403 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:06.989386 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tpthn"
Apr 22 18:36:06.993819 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:06.993791 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83267a92_55fb_45ae_8856_cfb92fa1ca05.slice/crio-10773293efd690f05e6286c37878cfbd2da957c52cd834c95eb52f44b5ec6a82 WatchSource:0}: Error finding container 10773293efd690f05e6286c37878cfbd2da957c52cd834c95eb52f44b5ec6a82: Status 404 returned error can't find the container with id 10773293efd690f05e6286c37878cfbd2da957c52cd834c95eb52f44b5ec6a82
Apr 22 18:36:06.996839 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:06.996814 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8425f70_4f14_4d86_b30e_3abe38269764.slice/crio-19ef58383747851c8f092c29471c6bcfc9afbf6e4d5dbd0e776903dc740bc0ff WatchSource:0}: Error finding container 19ef58383747851c8f092c29471c6bcfc9afbf6e4d5dbd0e776903dc740bc0ff: Status 404 returned error can't find the container with id 19ef58383747851c8f092c29471c6bcfc9afbf6e4d5dbd0e776903dc740bc0ff
Apr 22 18:36:07.198535 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:07.198448 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs\") pod \"network-metrics-daemon-k7crw\" (UID: \"fff77f0b-c2fb-4acb-b894-ce916d7cf9d2\") " pod="openshift-multus/network-metrics-daemon-k7crw"
Apr 22 18:36:07.198719 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:07.198612 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:07.198790 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:07.198719 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs podName:fff77f0b-c2fb-4acb-b894-ce916d7cf9d2 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:08.198659679 +0000 UTC m=+3.057828384 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs") pod "network-metrics-daemon-k7crw" (UID: "fff77f0b-c2fb-4acb-b894-ce916d7cf9d2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:07.400367 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:07.400317 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lbrq\" (UniqueName: \"kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq\") pod \"network-check-target-5h8ps\" (UID: \"516d0b19-b6db-46c2-9865-24e9c2e844fc\") " pod="openshift-network-diagnostics/network-check-target-5h8ps"
Apr 22 18:36:07.400598 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:07.400574 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:36:07.400598 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:07.400606 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:36:07.400725 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:07.400618 2572 projected.go:194] Error preparing data for projected volume kube-api-access-7lbrq for pod openshift-network-diagnostics/network-check-target-5h8ps: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:36:07.400725 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:07.400676 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq podName:516d0b19-b6db-46c2-9865-24e9c2e844fc nodeName:}" failed. No retries permitted until 2026-04-22 18:36:08.400656594 +0000 UTC m=+3.259825306 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-7lbrq" (UniqueName: "kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq") pod "network-check-target-5h8ps" (UID: "516d0b19-b6db-46c2-9865-24e9c2e844fc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:36:07.465060 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:07.464987 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:36:07.659893 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:07.659801 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:31:06 +0000 UTC" deadline="2028-02-06 11:12:16.58154873 +0000 UTC"
Apr 22 18:36:07.659893 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:07.659846 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15712h36m8.921707915s"
Apr 22 18:36:07.737591 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:07.736974 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5h8ps"
Apr 22 18:36:07.737591 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:07.737087 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5h8ps" podUID="516d0b19-b6db-46c2-9865-24e9c2e844fc"
Apr 22 18:36:07.755361 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:07.755315 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wlpch" event={"ID":"915264b6-6df0-4100-9a03-985c5f546a4b","Type":"ContainerStarted","Data":"37c419de24b423467477f6a4b10dcf1143b078f6f6b44bd706b8cfb0426f0094"}
Apr 22 18:36:07.761291 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:07.761267 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zw6cn" event={"ID":"3a24b441-1f95-45b3-b520-483d996f771f","Type":"ContainerStarted","Data":"6ef5431d996639338b905401d0ef41701ba0cbb7699bd0dd305d150111af3878"}
Apr 22 18:36:07.768216 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:07.768179 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hjctd" event={"ID":"d1294e5e-31d1-48a2-8134-4d7b0f658d42","Type":"ContainerStarted","Data":"8c86d966f1f3c1aa2fc2ee842164b05aee240e395972e655f2ea14c6239c6d44"}
Apr 22 18:36:07.780273 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:07.780249 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" event={"ID":"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c","Type":"ContainerStarted","Data":"e4b245ea73e6108921114b87ff005e3b39d4bcd338cdd9a3735d627ae218775d"}
Apr 22 18:36:07.800689 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:07.800544 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jl675" event={"ID":"9a7f054c-e2d0-4250-be22-6160ebb37eec","Type":"ContainerStarted","Data":"3eb028fd3c1387bedb43568ecef0aa472ef2d73929ed058ebb74c6e54edad838"}
Apr 22 18:36:07.818447 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:07.818392 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5vfs6" event={"ID":"06291473-0b0d-41e0-99f1-3d887d31c55e","Type":"ContainerStarted","Data":"d66eb305770ab111d77dd09647210962a7ce527bd6b723182a23ad24c3a17340"}
Apr 22 18:36:07.832388 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:07.832361 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tpthn" event={"ID":"d8425f70-4f14-4d86-b30e-3abe38269764","Type":"ContainerStarted","Data":"19ef58383747851c8f092c29471c6bcfc9afbf6e4d5dbd0e776903dc740bc0ff"}
Apr 22 18:36:07.842146 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:07.841949 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" event={"ID":"83267a92-55fb-45ae-8856-cfb92fa1ca05","Type":"ContainerStarted","Data":"10773293efd690f05e6286c37878cfbd2da957c52cd834c95eb52f44b5ec6a82"}
Apr 22 18:36:07.860478 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:07.860304 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l" event={"ID":"ba489902-99d2-4aa8-afc6-aac5da21ebe8","Type":"ContainerStarted","Data":"5695146632ea3518b9ccb69a52593d5060f809f3208a10b0cc25665d0da5a411"}
Apr 22 18:36:08.207653 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:08.207596 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs\") pod \"network-metrics-daemon-k7crw\" (UID: \"fff77f0b-c2fb-4acb-b894-ce916d7cf9d2\") " pod="openshift-multus/network-metrics-daemon-k7crw"
Apr 22 18:36:08.207827 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:08.207765 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:08.207907 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:08.207833 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs podName:fff77f0b-c2fb-4acb-b894-ce916d7cf9d2 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:10.207813171 +0000 UTC m=+5.066981874 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs") pod "network-metrics-daemon-k7crw" (UID: "fff77f0b-c2fb-4acb-b894-ce916d7cf9d2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:08.415397 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:08.414741 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lbrq\" (UniqueName: \"kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq\") pod \"network-check-target-5h8ps\" (UID: \"516d0b19-b6db-46c2-9865-24e9c2e844fc\") " pod="openshift-network-diagnostics/network-check-target-5h8ps"
Apr 22 18:36:08.415397 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:08.414927 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:36:08.415397 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:08.414947 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:36:08.415397 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:08.414960 2572 projected.go:194] Error preparing data for projected volume kube-api-access-7lbrq for pod openshift-network-diagnostics/network-check-target-5h8ps: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:36:08.415397 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:08.415022 2572
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq podName:516d0b19-b6db-46c2-9865-24e9c2e844fc nodeName:}" failed. No retries permitted until 2026-04-22 18:36:10.414998414 +0000 UTC m=+5.274167116 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7lbrq" (UniqueName: "kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq") pod "network-check-target-5h8ps" (UID: "516d0b19-b6db-46c2-9865-24e9c2e844fc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:08.660199 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:08.660154 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:31:06 +0000 UTC" deadline="2028-02-03 23:00:28.887883762 +0000 UTC" Apr 22 18:36:08.660199 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:08.660196 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15652h24m20.227691375s" Apr 22 18:36:08.731089 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:08.730596 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:08.731089 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:08.730729 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k7crw" podUID="fff77f0b-c2fb-4acb-b894-ce916d7cf9d2" Apr 22 18:36:09.215546 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:09.215015 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:09.733081 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:09.732588 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:09.733081 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:09.732718 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5h8ps" podUID="516d0b19-b6db-46c2-9865-24e9c2e844fc" Apr 22 18:36:10.231526 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:10.231487 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs\") pod \"network-metrics-daemon-k7crw\" (UID: \"fff77f0b-c2fb-4acb-b894-ce916d7cf9d2\") " pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:10.231758 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:10.231631 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:10.231758 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:10.231709 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs podName:fff77f0b-c2fb-4acb-b894-ce916d7cf9d2 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:14.231688254 +0000 UTC m=+9.090856965 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs") pod "network-metrics-daemon-k7crw" (UID: "fff77f0b-c2fb-4acb-b894-ce916d7cf9d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:10.433756 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:10.433149 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lbrq\" (UniqueName: \"kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq\") pod \"network-check-target-5h8ps\" (UID: \"516d0b19-b6db-46c2-9865-24e9c2e844fc\") " pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:10.433756 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:10.433294 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:10.433756 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:10.433313 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:10.433756 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:10.433341 2572 projected.go:194] Error preparing data for projected volume kube-api-access-7lbrq for pod openshift-network-diagnostics/network-check-target-5h8ps: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:10.433756 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:10.433401 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq 
podName:516d0b19-b6db-46c2-9865-24e9c2e844fc nodeName:}" failed. No retries permitted until 2026-04-22 18:36:14.433380468 +0000 UTC m=+9.292549184 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7lbrq" (UniqueName: "kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq") pod "network-check-target-5h8ps" (UID: "516d0b19-b6db-46c2-9865-24e9c2e844fc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:10.731482 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:10.730977 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:10.731482 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:10.731131 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7crw" podUID="fff77f0b-c2fb-4acb-b894-ce916d7cf9d2" Apr 22 18:36:11.732005 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:11.731972 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:11.732504 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:11.732111 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5h8ps" podUID="516d0b19-b6db-46c2-9865-24e9c2e844fc" Apr 22 18:36:12.731086 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:12.731051 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:12.731245 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:12.731191 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7crw" podUID="fff77f0b-c2fb-4acb-b894-ce916d7cf9d2" Apr 22 18:36:13.731184 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:13.731151 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:13.731717 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:13.731269 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5h8ps" podUID="516d0b19-b6db-46c2-9865-24e9c2e844fc" Apr 22 18:36:14.265010 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:14.264921 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs\") pod \"network-metrics-daemon-k7crw\" (UID: \"fff77f0b-c2fb-4acb-b894-ce916d7cf9d2\") " pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:14.265207 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:14.265085 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:14.265207 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:14.265158 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs podName:fff77f0b-c2fb-4acb-b894-ce916d7cf9d2 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:22.265135775 +0000 UTC m=+17.124304491 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs") pod "network-metrics-daemon-k7crw" (UID: "fff77f0b-c2fb-4acb-b894-ce916d7cf9d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:14.466981 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:14.466893 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lbrq\" (UniqueName: \"kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq\") pod \"network-check-target-5h8ps\" (UID: \"516d0b19-b6db-46c2-9865-24e9c2e844fc\") " pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:14.467163 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:14.467073 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:14.467163 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:14.467103 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:14.467163 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:14.467119 2572 projected.go:194] Error preparing data for projected volume kube-api-access-7lbrq for pod openshift-network-diagnostics/network-check-target-5h8ps: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:14.467322 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:14.467186 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq podName:516d0b19-b6db-46c2-9865-24e9c2e844fc nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:22.467168006 +0000 UTC m=+17.326336706 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7lbrq" (UniqueName: "kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq") pod "network-check-target-5h8ps" (UID: "516d0b19-b6db-46c2-9865-24e9c2e844fc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:14.731342 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:14.730809 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:14.731342 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:14.730950 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7crw" podUID="fff77f0b-c2fb-4acb-b894-ce916d7cf9d2" Apr 22 18:36:15.732038 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:15.732004 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:15.732508 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:15.732151 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5h8ps" podUID="516d0b19-b6db-46c2-9865-24e9c2e844fc" Apr 22 18:36:16.730684 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:16.730652 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:16.730834 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:16.730764 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7crw" podUID="fff77f0b-c2fb-4acb-b894-ce916d7cf9d2" Apr 22 18:36:17.731443 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:17.731411 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:17.731850 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:17.731535 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5h8ps" podUID="516d0b19-b6db-46c2-9865-24e9c2e844fc" Apr 22 18:36:18.731316 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:18.731286 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:18.731497 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:18.731414 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7crw" podUID="fff77f0b-c2fb-4acb-b894-ce916d7cf9d2" Apr 22 18:36:19.731227 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:19.731192 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:19.731425 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:19.731296 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5h8ps" podUID="516d0b19-b6db-46c2-9865-24e9c2e844fc" Apr 22 18:36:20.730856 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:20.730826 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:20.731238 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:20.730961 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k7crw" podUID="fff77f0b-c2fb-4acb-b894-ce916d7cf9d2" Apr 22 18:36:21.731581 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:21.731550 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:21.731959 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:21.731646 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5h8ps" podUID="516d0b19-b6db-46c2-9865-24e9c2e844fc" Apr 22 18:36:22.326731 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:22.326695 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs\") pod \"network-metrics-daemon-k7crw\" (UID: \"fff77f0b-c2fb-4acb-b894-ce916d7cf9d2\") " pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:22.326937 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:22.326838 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:22.327010 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:22.326920 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs podName:fff77f0b-c2fb-4acb-b894-ce916d7cf9d2 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:38.326896611 +0000 UTC m=+33.186065318 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs") pod "network-metrics-daemon-k7crw" (UID: "fff77f0b-c2fb-4acb-b894-ce916d7cf9d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:22.528508 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:22.528476 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lbrq\" (UniqueName: \"kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq\") pod \"network-check-target-5h8ps\" (UID: \"516d0b19-b6db-46c2-9865-24e9c2e844fc\") " pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:22.528670 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:22.528647 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:22.528670 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:22.528670 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:22.528772 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:22.528680 2572 projected.go:194] Error preparing data for projected volume kube-api-access-7lbrq for pod openshift-network-diagnostics/network-check-target-5h8ps: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:22.528772 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:22.528727 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq podName:516d0b19-b6db-46c2-9865-24e9c2e844fc nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:38.528713064 +0000 UTC m=+33.387881766 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7lbrq" (UniqueName: "kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq") pod "network-check-target-5h8ps" (UID: "516d0b19-b6db-46c2-9865-24e9c2e844fc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:22.730525 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:22.730489 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:22.730682 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:22.730614 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7crw" podUID="fff77f0b-c2fb-4acb-b894-ce916d7cf9d2" Apr 22 18:36:23.730828 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:23.730792 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:23.731243 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:23.730926 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5h8ps" podUID="516d0b19-b6db-46c2-9865-24e9c2e844fc" Apr 22 18:36:24.730647 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:24.730625 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:24.731379 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:24.731349 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7crw" podUID="fff77f0b-c2fb-4acb-b894-ce916d7cf9d2" Apr 22 18:36:25.731540 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.731323 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:25.732391 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:25.731657 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5h8ps" podUID="516d0b19-b6db-46c2-9865-24e9c2e844fc" Apr 22 18:36:25.900019 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.899838 2572 generic.go:358] "Generic (PLEG): container finished" podID="b50e7600b47f76668e274e014c99f3ac" containerID="74540a5be345d4efc2bb3a01f01a16c254a25fda60d100faba4102ea1d7e85b1" exitCode=0 Apr 22 18:36:25.900155 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.900049 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal" Apr 22 18:36:25.900155 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.899924 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal" event={"ID":"b50e7600b47f76668e274e014c99f3ac","Type":"ContainerDied","Data":"74540a5be345d4efc2bb3a01f01a16c254a25fda60d100faba4102ea1d7e85b1"} Apr 22 18:36:25.902384 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.902367 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 18:36:25.902664 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.902642 2572 generic.go:358] "Generic (PLEG): container finished" podID="62a2a2c8-4324-4276-a6c1-57c3f81c4b5c" containerID="eb62f47b76a2837385ff2995e8117c51cacb0e9613c428aa490c5f11a542060e" exitCode=1 Apr 22 18:36:25.902730 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.902711 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" event={"ID":"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c","Type":"ContainerStarted","Data":"f29e5fd11433272ef3ffd3585fdf3bdff4fef32b9cc2334a076c1ccf116887c2"} Apr 22 18:36:25.902770 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.902738 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" event={"ID":"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c","Type":"ContainerStarted","Data":"1bed2c38a388f179b511932461bce0bfb6b2c85042b47613031de4215025f8d5"} Apr 22 18:36:25.902770 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.902748 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" event={"ID":"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c","Type":"ContainerStarted","Data":"5dfe38441530170d543f90f59e0489175353450bf2fa9171c560d3db88ed2112"} Apr 22 18:36:25.902770 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.902756 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" event={"ID":"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c","Type":"ContainerStarted","Data":"ac7069d33bb785662f84e3a5a6d2b96ba884b5a2a4fc46f5de60d9612fca5fd8"} Apr 22 18:36:25.902770 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.902763 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" event={"ID":"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c","Type":"ContainerDied","Data":"eb62f47b76a2837385ff2995e8117c51cacb0e9613c428aa490c5f11a542060e"} Apr 22 18:36:25.902931 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.902773 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" event={"ID":"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c","Type":"ContainerStarted","Data":"585bdc87a1c4ff5533d26ec39ac8f0df18ad8c27f1070d8b398d5c1bed4cc154"} Apr 22 18:36:25.903697 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.903679 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jl675" event={"ID":"9a7f054c-e2d0-4250-be22-6160ebb37eec","Type":"ContainerStarted","Data":"8e06edadf1ea9e2009ebe04dfd01c0522d55f916c409deaacd4d6cfa80a23afc"} Apr 22 18:36:25.904829 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.904810 2572 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal" event={"ID":"55d83b29c984d704f2c407ca2173be08","Type":"ContainerStarted","Data":"0ca5d60107add56c85f2a1e0706390ab02f3fcf6322850e83fc161925106ddf3"} Apr 22 18:36:25.905932 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.905915 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tpthn" event={"ID":"d8425f70-4f14-4d86-b30e-3abe38269764","Type":"ContainerStarted","Data":"d4df10ae47de941fbc1b931533f200735c57281b42a8bd0be664f10122065e34"} Apr 22 18:36:25.907038 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.907020 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" event={"ID":"83267a92-55fb-45ae-8856-cfb92fa1ca05","Type":"ContainerStarted","Data":"5d76cfa6b4c13c871533f24dddcadaf9fb0c25664a1588666952aed28cf386fd"} Apr 22 18:36:25.908170 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.908146 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l" event={"ID":"ba489902-99d2-4aa8-afc6-aac5da21ebe8","Type":"ContainerStarted","Data":"5979271a07828f56bdf029113dc8164cfaaaf5bc2c7b9cc721244b02023f5715"} Apr 22 18:36:25.909280 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.909254 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wlpch" event={"ID":"915264b6-6df0-4100-9a03-985c5f546a4b","Type":"ContainerStarted","Data":"9d1a102b7216031cfcc8b6ce1f2b84940c07b0120f491005ede1541e024e6bd6"} Apr 22 18:36:25.909661 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.909634 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:36:25.910303 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.910278 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal"] Apr 22 18:36:25.910605 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.910585 2572 generic.go:358] "Generic (PLEG): container finished" podID="3a24b441-1f95-45b3-b520-483d996f771f" containerID="7c038e80c02bf51f52d6d36c7d4f2bc8085d3ae529fab3e37dde311f158f26a1" exitCode=0 Apr 22 18:36:25.910700 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.910614 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zw6cn" event={"ID":"3a24b441-1f95-45b3-b520-483d996f771f","Type":"ContainerDied","Data":"7c038e80c02bf51f52d6d36c7d4f2bc8085d3ae529fab3e37dde311f158f26a1"} Apr 22 18:36:25.911872 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.911810 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hjctd" event={"ID":"d1294e5e-31d1-48a2-8134-4d7b0f658d42","Type":"ContainerStarted","Data":"340bc7851dc0c59f472742a4295302690d62d4de81087b470fec3fd2ed8e7707"} Apr 22 18:36:25.922377 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.922318 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jl675" podStartSLOduration=2.7650959950000003 podStartE2EDuration="20.922303439s" podCreationTimestamp="2026-04-22 18:36:05 +0000 UTC" firstStartedPulling="2026-04-22 18:36:06.960087279 +0000 UTC m=+1.819255976" lastFinishedPulling="2026-04-22 18:36:25.117294707 +0000 UTC m=+19.976463420" observedRunningTime="2026-04-22 18:36:25.921265105 +0000 UTC m=+20.780433824" watchObservedRunningTime="2026-04-22 18:36:25.922303439 +0000 UTC m=+20.781472159" Apr 22 18:36:25.935866 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.935824 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-wlpch" podStartSLOduration=3.221675896 podStartE2EDuration="20.935811113s" podCreationTimestamp="2026-04-22 18:36:05 +0000 UTC" 
firstStartedPulling="2026-04-22 18:36:06.983428176 +0000 UTC m=+1.842596874" lastFinishedPulling="2026-04-22 18:36:24.697563376 +0000 UTC m=+19.556732091" observedRunningTime="2026-04-22 18:36:25.935692524 +0000 UTC m=+20.794861246" watchObservedRunningTime="2026-04-22 18:36:25.935811113 +0000 UTC m=+20.794979836" Apr 22 18:36:25.951534 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.951490 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-p4kkw" podStartSLOduration=3.216548398 podStartE2EDuration="20.951474876s" podCreationTimestamp="2026-04-22 18:36:05 +0000 UTC" firstStartedPulling="2026-04-22 18:36:06.995612503 +0000 UTC m=+1.854781204" lastFinishedPulling="2026-04-22 18:36:24.730538977 +0000 UTC m=+19.589707682" observedRunningTime="2026-04-22 18:36:25.951015091 +0000 UTC m=+20.810183827" watchObservedRunningTime="2026-04-22 18:36:25.951474876 +0000 UTC m=+20.810643597" Apr 22 18:36:25.965503 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.965461 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tpthn" podStartSLOduration=3.235567607 podStartE2EDuration="20.965449802s" podCreationTimestamp="2026-04-22 18:36:05 +0000 UTC" firstStartedPulling="2026-04-22 18:36:06.99850046 +0000 UTC m=+1.857669163" lastFinishedPulling="2026-04-22 18:36:24.728382646 +0000 UTC m=+19.587551358" observedRunningTime="2026-04-22 18:36:25.965057852 +0000 UTC m=+20.824226594" watchObservedRunningTime="2026-04-22 18:36:25.965449802 +0000 UTC m=+20.824618548" Apr 22 18:36:25.995426 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:25.995384 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal" podStartSLOduration=19.995370479 podStartE2EDuration="19.995370479s" podCreationTimestamp="2026-04-22 18:36:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:36:25.99500477 +0000 UTC m=+20.854173489" watchObservedRunningTime="2026-04-22 18:36:25.995370479 +0000 UTC m=+20.854539200" Apr 22 18:36:26.014010 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:26.013975 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hjctd" podStartSLOduration=3.246811759 podStartE2EDuration="21.013966304s" podCreationTimestamp="2026-04-22 18:36:05 +0000 UTC" firstStartedPulling="2026-04-22 18:36:06.930414224 +0000 UTC m=+1.789582922" lastFinishedPulling="2026-04-22 18:36:24.69756876 +0000 UTC m=+19.556737467" observedRunningTime="2026-04-22 18:36:26.013901335 +0000 UTC m=+20.873070055" watchObservedRunningTime="2026-04-22 18:36:26.013966304 +0000 UTC m=+20.873135023" Apr 22 18:36:26.590114 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:26.590086 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:36:26.666045 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:26.665944 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:36:26.590107435Z","UUID":"64d938ba-63e5-4d4f-9970-274488524a27","Handler":null,"Name":"","Endpoint":""} Apr 22 18:36:26.668572 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:26.668547 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:36:26.668794 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:26.668591 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:36:26.731056 ip-10-0-131-5 kubenswrapper[2572]: I0422 
18:36:26.730934 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:26.731056 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:26.731044 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7crw" podUID="fff77f0b-c2fb-4acb-b894-ce916d7cf9d2" Apr 22 18:36:26.915374 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:26.915321 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal" event={"ID":"b50e7600b47f76668e274e014c99f3ac","Type":"ContainerStarted","Data":"f15f9f3096cfa1e30b43c5ef7fc2516cb25cafeaa8310632582aaf8d2df8e1c0"} Apr 22 18:36:26.917047 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:26.917009 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5vfs6" event={"ID":"06291473-0b0d-41e0-99f1-3d887d31c55e","Type":"ContainerStarted","Data":"8af1206690bc70b7d0e2601cdb0443ad4d49b0acc77466a567d74a710329ec07"} Apr 22 18:36:26.919151 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:26.918905 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l" event={"ID":"ba489902-99d2-4aa8-afc6-aac5da21ebe8","Type":"ContainerStarted","Data":"fc9c03638553b8a3d219995c117de336ff4803d8d4ca484bf1239149f827c9b8"} Apr 22 18:36:26.931400 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:26.931355 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal" podStartSLOduration=1.9313431269999999 
podStartE2EDuration="1.931343127s" podCreationTimestamp="2026-04-22 18:36:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:36:26.931315821 +0000 UTC m=+21.790484541" watchObservedRunningTime="2026-04-22 18:36:26.931343127 +0000 UTC m=+21.790511840" Apr 22 18:36:26.945924 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:26.945877 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5vfs6" podStartSLOduration=4.125909392 podStartE2EDuration="21.945862038s" podCreationTimestamp="2026-04-22 18:36:05 +0000 UTC" firstStartedPulling="2026-04-22 18:36:06.911507241 +0000 UTC m=+1.770675938" lastFinishedPulling="2026-04-22 18:36:24.731459874 +0000 UTC m=+19.590628584" observedRunningTime="2026-04-22 18:36:26.94506194 +0000 UTC m=+21.804230662" watchObservedRunningTime="2026-04-22 18:36:26.945862038 +0000 UTC m=+21.805030762" Apr 22 18:36:27.731132 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:27.731050 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:27.731345 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:27.731168 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5h8ps" podUID="516d0b19-b6db-46c2-9865-24e9c2e844fc" Apr 22 18:36:27.922303 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:27.922268 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l" event={"ID":"ba489902-99d2-4aa8-afc6-aac5da21ebe8","Type":"ContainerStarted","Data":"f17f728caf95011242f98228a6bccb743a3d4541942ed12955f81bd651481e56"} Apr 22 18:36:27.925268 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:27.925244 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 18:36:27.925659 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:27.925624 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" event={"ID":"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c","Type":"ContainerStarted","Data":"e8abab178a3271b6a58cbb577b2537adab04fddd618253ba631558bc8e8454be"} Apr 22 18:36:27.940930 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:27.940877 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hq44l" podStartSLOduration=2.544947016 podStartE2EDuration="22.940859783s" podCreationTimestamp="2026-04-22 18:36:05 +0000 UTC" firstStartedPulling="2026-04-22 18:36:06.989266102 +0000 UTC m=+1.848434799" lastFinishedPulling="2026-04-22 18:36:27.385178864 +0000 UTC m=+22.244347566" observedRunningTime="2026-04-22 18:36:27.940037698 +0000 UTC m=+22.799206418" watchObservedRunningTime="2026-04-22 18:36:27.940859783 +0000 UTC m=+22.800028486" Apr 22 18:36:28.730682 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:28.730501 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:28.730869 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:28.730784 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7crw" podUID="fff77f0b-c2fb-4acb-b894-ce916d7cf9d2" Apr 22 18:36:29.731367 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:29.731322 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:29.731801 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:29.731448 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5h8ps" podUID="516d0b19-b6db-46c2-9865-24e9c2e844fc" Apr 22 18:36:30.389585 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:30.389552 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-wlpch" Apr 22 18:36:30.390170 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:30.390149 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-wlpch" Apr 22 18:36:30.731035 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:30.730957 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:30.731179 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:30.731061 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7crw" podUID="fff77f0b-c2fb-4acb-b894-ce916d7cf9d2" Apr 22 18:36:30.931581 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:30.931543 2572 generic.go:358] "Generic (PLEG): container finished" podID="3a24b441-1f95-45b3-b520-483d996f771f" containerID="1f11411924e0f5c57850b72074dab5e6a0c827423ce90f1fe0d5744f4d1e931b" exitCode=0 Apr 22 18:36:30.931963 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:30.931639 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zw6cn" event={"ID":"3a24b441-1f95-45b3-b520-483d996f771f","Type":"ContainerDied","Data":"1f11411924e0f5c57850b72074dab5e6a0c827423ce90f1fe0d5744f4d1e931b"} Apr 22 18:36:30.934724 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:30.934701 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 18:36:30.934975 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:30.934958 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" event={"ID":"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c","Type":"ContainerStarted","Data":"db846cc46e8d321094a8e004392ad8742bd9e28064eda033e268a6a3e1dded13"} Apr 22 18:36:30.935279 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:30.935261 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:30.935381 ip-10-0-131-5 
kubenswrapper[2572]: I0422 18:36:30.935363 2572 scope.go:117] "RemoveContainer" containerID="eb62f47b76a2837385ff2995e8117c51cacb0e9613c428aa490c5f11a542060e" Apr 22 18:36:30.950953 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:30.950931 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:31.730973 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:31.730949 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:31.731067 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:31.731037 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5h8ps" podUID="516d0b19-b6db-46c2-9865-24e9c2e844fc" Apr 22 18:36:31.927314 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:31.927250 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k7crw"] Apr 22 18:36:31.927452 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:31.927378 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:31.927494 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:31.927478 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k7crw" podUID="fff77f0b-c2fb-4acb-b894-ce916d7cf9d2" Apr 22 18:36:31.930324 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:31.930305 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5h8ps"] Apr 22 18:36:31.938420 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:31.938393 2572 generic.go:358] "Generic (PLEG): container finished" podID="3a24b441-1f95-45b3-b520-483d996f771f" containerID="2df4c1dea9d2f17a44fd5ac28af3a490d66cd8354d76eaab8b6b1e75c061c262" exitCode=0 Apr 22 18:36:31.939083 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:31.938451 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zw6cn" event={"ID":"3a24b441-1f95-45b3-b520-483d996f771f","Type":"ContainerDied","Data":"2df4c1dea9d2f17a44fd5ac28af3a490d66cd8354d76eaab8b6b1e75c061c262"} Apr 22 18:36:31.941904 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:31.941886 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 18:36:31.942270 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:31.942252 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:31.942379 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:31.942284 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" event={"ID":"62a2a2c8-4324-4276-a6c1-57c3f81c4b5c","Type":"ContainerStarted","Data":"1c7c80c74d3311125fec56467861df60d70621617af0cdb94aaa223c7ae997f9"} Apr 22 18:36:31.942379 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:31.942342 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5h8ps" podUID="516d0b19-b6db-46c2-9865-24e9c2e844fc" Apr 22 18:36:31.942488 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:31.942464 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:36:31.942739 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:31.942720 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:31.957739 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:31.957714 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:31.990446 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:31.990412 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" podStartSLOduration=9.212689706 podStartE2EDuration="26.990401379s" podCreationTimestamp="2026-04-22 18:36:05 +0000 UTC" firstStartedPulling="2026-04-22 18:36:06.966083124 +0000 UTC m=+1.825251827" lastFinishedPulling="2026-04-22 18:36:24.743794796 +0000 UTC m=+19.602963500" observedRunningTime="2026-04-22 
18:36:31.989084448 +0000 UTC m=+26.848253167" watchObservedRunningTime="2026-04-22 18:36:31.990401379 +0000 UTC m=+26.849570144" Apr 22 18:36:32.945359 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:32.945128 2572 generic.go:358] "Generic (PLEG): container finished" podID="3a24b441-1f95-45b3-b520-483d996f771f" containerID="957ad83c13029f93ea3adcf966d2c526f062c79843811728f819743a79040af4" exitCode=0 Apr 22 18:36:32.945685 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:32.945211 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zw6cn" event={"ID":"3a24b441-1f95-45b3-b520-483d996f771f","Type":"ContainerDied","Data":"957ad83c13029f93ea3adcf966d2c526f062c79843811728f819743a79040af4"} Apr 22 18:36:32.945685 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:32.945529 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:36:33.731074 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:33.731043 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:33.731236 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:33.731086 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:33.731236 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:33.731216 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k7crw" podUID="fff77f0b-c2fb-4acb-b894-ce916d7cf9d2" Apr 22 18:36:33.731359 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:33.731316 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5h8ps" podUID="516d0b19-b6db-46c2-9865-24e9c2e844fc" Apr 22 18:36:33.948031 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:33.948006 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:36:35.288204 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:35.288075 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:36:35.288637 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:35.288291 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:36:35.305257 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:35.305205 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" podUID="62a2a2c8-4324-4276-a6c1-57c3f81c4b5c" containerName="ovnkube-controller" probeResult="failure" output="" Apr 22 18:36:35.314025 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:35.313982 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" podUID="62a2a2c8-4324-4276-a6c1-57c3f81c4b5c" containerName="ovnkube-controller" probeResult="failure" output="" Apr 22 18:36:35.731838 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:35.731803 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:35.732012 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:35.731897 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:35.732012 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:35.731975 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7crw" podUID="fff77f0b-c2fb-4acb-b894-ce916d7cf9d2" Apr 22 18:36:35.732135 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:35.732013 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5h8ps" podUID="516d0b19-b6db-46c2-9865-24e9c2e844fc" Apr 22 18:36:35.924069 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:35.924036 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-wlpch" Apr 22 18:36:35.924224 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:35.924187 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:36:35.924761 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:35.924741 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-wlpch" Apr 22 18:36:36.974824 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:36.974755 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeReady" Apr 22 18:36:36.975225 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:36.974887 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:36:37.033800 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.033768 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ntjbb"] Apr 22 18:36:37.071076 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.071045 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2phz6"] Apr 22 18:36:37.071216 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.071199 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-ntjbb" Apr 22 18:36:37.077632 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.077565 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:36:37.077632 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.077601 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4wrcz\"" Apr 22 18:36:37.077869 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.077846 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:36:37.095338 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.095316 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ntjbb"] Apr 22 18:36:37.095446 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.095351 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2phz6"] Apr 22 18:36:37.095500 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.095442 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2phz6" Apr 22 18:36:37.097831 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.097812 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:36:37.097916 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.097844 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:36:37.097916 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.097889 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:36:37.097916 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.097898 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9qw5s\"" Apr 22 18:36:37.237187 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.237112 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert\") pod \"ingress-canary-2phz6\" (UID: \"f0ecf33d-061b-4ba1-9f1e-ec8f458b1027\") " pod="openshift-ingress-canary/ingress-canary-2phz6" Apr 22 18:36:37.237187 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.237176 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/57975d7c-6756-4dde-9d27-faa3e96cc6f5-config-volume\") pod \"dns-default-ntjbb\" (UID: \"57975d7c-6756-4dde-9d27-faa3e96cc6f5\") " pod="openshift-dns/dns-default-ntjbb" Apr 22 18:36:37.237402 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.237217 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/57975d7c-6756-4dde-9d27-faa3e96cc6f5-tmp-dir\") pod \"dns-default-ntjbb\" (UID: \"57975d7c-6756-4dde-9d27-faa3e96cc6f5\") " pod="openshift-dns/dns-default-ntjbb" Apr 22 18:36:37.237402 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.237236 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls\") pod \"dns-default-ntjbb\" (UID: \"57975d7c-6756-4dde-9d27-faa3e96cc6f5\") " pod="openshift-dns/dns-default-ntjbb" Apr 22 18:36:37.237402 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.237272 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zclv5\" (UniqueName: \"kubernetes.io/projected/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-kube-api-access-zclv5\") pod \"ingress-canary-2phz6\" (UID: \"f0ecf33d-061b-4ba1-9f1e-ec8f458b1027\") " pod="openshift-ingress-canary/ingress-canary-2phz6" Apr 22 18:36:37.237402 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.237377 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fwdj\" (UniqueName: \"kubernetes.io/projected/57975d7c-6756-4dde-9d27-faa3e96cc6f5-kube-api-access-4fwdj\") pod \"dns-default-ntjbb\" (UID: \"57975d7c-6756-4dde-9d27-faa3e96cc6f5\") " pod="openshift-dns/dns-default-ntjbb" Apr 22 18:36:37.338008 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.337974 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/57975d7c-6756-4dde-9d27-faa3e96cc6f5-config-volume\") pod \"dns-default-ntjbb\" (UID: \"57975d7c-6756-4dde-9d27-faa3e96cc6f5\") " pod="openshift-dns/dns-default-ntjbb" Apr 22 18:36:37.338167 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.338019 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57975d7c-6756-4dde-9d27-faa3e96cc6f5-tmp-dir\") pod \"dns-default-ntjbb\" (UID: \"57975d7c-6756-4dde-9d27-faa3e96cc6f5\") " pod="openshift-dns/dns-default-ntjbb" Apr 22 18:36:37.338167 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.338053 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls\") pod \"dns-default-ntjbb\" (UID: \"57975d7c-6756-4dde-9d27-faa3e96cc6f5\") " pod="openshift-dns/dns-default-ntjbb" Apr 22 18:36:37.338167 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.338082 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zclv5\" (UniqueName: \"kubernetes.io/projected/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-kube-api-access-zclv5\") pod \"ingress-canary-2phz6\" (UID: \"f0ecf33d-061b-4ba1-9f1e-ec8f458b1027\") " pod="openshift-ingress-canary/ingress-canary-2phz6" Apr 22 18:36:37.338167 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.338116 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fwdj\" (UniqueName: \"kubernetes.io/projected/57975d7c-6756-4dde-9d27-faa3e96cc6f5-kube-api-access-4fwdj\") pod \"dns-default-ntjbb\" (UID: \"57975d7c-6756-4dde-9d27-faa3e96cc6f5\") " pod="openshift-dns/dns-default-ntjbb" Apr 22 18:36:37.338413 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:37.338213 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:37.338413 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:37.338279 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:37.338413 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.338212 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert\") pod \"ingress-canary-2phz6\" (UID: \"f0ecf33d-061b-4ba1-9f1e-ec8f458b1027\") " pod="openshift-ingress-canary/ingress-canary-2phz6" Apr 22 18:36:37.338413 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:37.338285 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls podName:57975d7c-6756-4dde-9d27-faa3e96cc6f5 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:37.838264706 +0000 UTC m=+32.697433409 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls") pod "dns-default-ntjbb" (UID: "57975d7c-6756-4dde-9d27-faa3e96cc6f5") : secret "dns-default-metrics-tls" not found Apr 22 18:36:37.338413 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:37.338369 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert podName:f0ecf33d-061b-4ba1-9f1e-ec8f458b1027 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:37.838350281 +0000 UTC m=+32.697518993 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert") pod "ingress-canary-2phz6" (UID: "f0ecf33d-061b-4ba1-9f1e-ec8f458b1027") : secret "canary-serving-cert" not found Apr 22 18:36:37.338679 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.338657 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57975d7c-6756-4dde-9d27-faa3e96cc6f5-tmp-dir\") pod \"dns-default-ntjbb\" (UID: \"57975d7c-6756-4dde-9d27-faa3e96cc6f5\") " pod="openshift-dns/dns-default-ntjbb" Apr 22 18:36:37.338679 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.338659 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/57975d7c-6756-4dde-9d27-faa3e96cc6f5-config-volume\") pod \"dns-default-ntjbb\" (UID: \"57975d7c-6756-4dde-9d27-faa3e96cc6f5\") " pod="openshift-dns/dns-default-ntjbb" Apr 22 18:36:37.348793 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.348772 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fwdj\" (UniqueName: \"kubernetes.io/projected/57975d7c-6756-4dde-9d27-faa3e96cc6f5-kube-api-access-4fwdj\") pod \"dns-default-ntjbb\" (UID: \"57975d7c-6756-4dde-9d27-faa3e96cc6f5\") " pod="openshift-dns/dns-default-ntjbb" Apr 22 18:36:37.349089 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.349068 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zclv5\" (UniqueName: \"kubernetes.io/projected/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-kube-api-access-zclv5\") pod \"ingress-canary-2phz6\" (UID: \"f0ecf33d-061b-4ba1-9f1e-ec8f458b1027\") " pod="openshift-ingress-canary/ingress-canary-2phz6" Apr 22 18:36:37.731049 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.731010 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:37.731512 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.731220 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:37.733997 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.733974 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:36:37.734944 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.734925 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vgxpp\"" Apr 22 18:36:37.734944 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.734939 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dfh6w\"" Apr 22 18:36:37.735100 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.735039 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:36:37.735100 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.735043 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:36:37.842216 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.842187 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls\") pod \"dns-default-ntjbb\" (UID: \"57975d7c-6756-4dde-9d27-faa3e96cc6f5\") " pod="openshift-dns/dns-default-ntjbb" Apr 22 18:36:37.842376 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:37.842314 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:37.842444 
ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:37.842398 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert\") pod \"ingress-canary-2phz6\" (UID: \"f0ecf33d-061b-4ba1-9f1e-ec8f458b1027\") " pod="openshift-ingress-canary/ingress-canary-2phz6" Apr 22 18:36:37.842517 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:37.842483 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls podName:57975d7c-6756-4dde-9d27-faa3e96cc6f5 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:38.84245573 +0000 UTC m=+33.701624436 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls") pod "dns-default-ntjbb" (UID: "57975d7c-6756-4dde-9d27-faa3e96cc6f5") : secret "dns-default-metrics-tls" not found Apr 22 18:36:37.842574 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:37.842528 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:37.842623 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:37.842577 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert podName:f0ecf33d-061b-4ba1-9f1e-ec8f458b1027 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:38.842561015 +0000 UTC m=+33.701729719 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert") pod "ingress-canary-2phz6" (UID: "f0ecf33d-061b-4ba1-9f1e-ec8f458b1027") : secret "canary-serving-cert" not found Apr 22 18:36:38.346924 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:38.346886 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs\") pod \"network-metrics-daemon-k7crw\" (UID: \"fff77f0b-c2fb-4acb-b894-ce916d7cf9d2\") " pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:36:38.347431 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:38.347040 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:36:38.347431 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:38.347111 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs podName:fff77f0b-c2fb-4acb-b894-ce916d7cf9d2 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:10.347092699 +0000 UTC m=+65.206261397 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs") pod "network-metrics-daemon-k7crw" (UID: "fff77f0b-c2fb-4acb-b894-ce916d7cf9d2") : secret "metrics-daemon-secret" not found Apr 22 18:36:38.548655 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:38.548621 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lbrq\" (UniqueName: \"kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq\") pod \"network-check-target-5h8ps\" (UID: \"516d0b19-b6db-46c2-9865-24e9c2e844fc\") " pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:38.551275 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:38.551238 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lbrq\" (UniqueName: \"kubernetes.io/projected/516d0b19-b6db-46c2-9865-24e9c2e844fc-kube-api-access-7lbrq\") pod \"network-check-target-5h8ps\" (UID: \"516d0b19-b6db-46c2-9865-24e9c2e844fc\") " pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:38.648763 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:38.648683 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:38.850303 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:38.850279 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls\") pod \"dns-default-ntjbb\" (UID: \"57975d7c-6756-4dde-9d27-faa3e96cc6f5\") " pod="openshift-dns/dns-default-ntjbb" Apr 22 18:36:38.850432 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:38.850358 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert\") pod \"ingress-canary-2phz6\" (UID: \"f0ecf33d-061b-4ba1-9f1e-ec8f458b1027\") " pod="openshift-ingress-canary/ingress-canary-2phz6" Apr 22 18:36:38.850495 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:38.850453 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:38.850495 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:38.850461 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:38.850581 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:38.850521 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert podName:f0ecf33d-061b-4ba1-9f1e-ec8f458b1027 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:40.850501372 +0000 UTC m=+35.709670071 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert") pod "ingress-canary-2phz6" (UID: "f0ecf33d-061b-4ba1-9f1e-ec8f458b1027") : secret "canary-serving-cert" not found Apr 22 18:36:38.850581 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:38.850539 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls podName:57975d7c-6756-4dde-9d27-faa3e96cc6f5 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:40.850530802 +0000 UTC m=+35.709699504 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls") pod "dns-default-ntjbb" (UID: "57975d7c-6756-4dde-9d27-faa3e96cc6f5") : secret "dns-default-metrics-tls" not found Apr 22 18:36:38.990107 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:38.989988 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5h8ps"] Apr 22 18:36:38.993196 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:36:38.993172 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod516d0b19_b6db_46c2_9865_24e9c2e844fc.slice/crio-c782f2174ac9824ad948c510690f0ef452a87d93b10f1d170ac87a30c61fd9cf WatchSource:0}: Error finding container c782f2174ac9824ad948c510690f0ef452a87d93b10f1d170ac87a30c61fd9cf: Status 404 returned error can't find the container with id c782f2174ac9824ad948c510690f0ef452a87d93b10f1d170ac87a30c61fd9cf Apr 22 18:36:39.961722 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:39.961681 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5h8ps" event={"ID":"516d0b19-b6db-46c2-9865-24e9c2e844fc","Type":"ContainerStarted","Data":"c782f2174ac9824ad948c510690f0ef452a87d93b10f1d170ac87a30c61fd9cf"} Apr 
22 18:36:39.964371 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:39.964325 2572 generic.go:358] "Generic (PLEG): container finished" podID="3a24b441-1f95-45b3-b520-483d996f771f" containerID="212c8c3a6191b0f423685f47b60608f62200c9ad5fd029e72e9d8db535e6e5b6" exitCode=0 Apr 22 18:36:39.964497 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:39.964378 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zw6cn" event={"ID":"3a24b441-1f95-45b3-b520-483d996f771f","Type":"ContainerDied","Data":"212c8c3a6191b0f423685f47b60608f62200c9ad5fd029e72e9d8db535e6e5b6"} Apr 22 18:36:40.863657 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:40.863621 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert\") pod \"ingress-canary-2phz6\" (UID: \"f0ecf33d-061b-4ba1-9f1e-ec8f458b1027\") " pod="openshift-ingress-canary/ingress-canary-2phz6" Apr 22 18:36:40.863844 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:40.863708 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls\") pod \"dns-default-ntjbb\" (UID: \"57975d7c-6756-4dde-9d27-faa3e96cc6f5\") " pod="openshift-dns/dns-default-ntjbb" Apr 22 18:36:40.863844 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:40.863767 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:40.863844 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:40.863785 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:40.863844 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:40.863841 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert 
podName:f0ecf33d-061b-4ba1-9f1e-ec8f458b1027 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:44.863822017 +0000 UTC m=+39.722990715 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert") pod "ingress-canary-2phz6" (UID: "f0ecf33d-061b-4ba1-9f1e-ec8f458b1027") : secret "canary-serving-cert" not found Apr 22 18:36:40.864001 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:40.863858 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls podName:57975d7c-6756-4dde-9d27-faa3e96cc6f5 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:44.863850381 +0000 UTC m=+39.723019078 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls") pod "dns-default-ntjbb" (UID: "57975d7c-6756-4dde-9d27-faa3e96cc6f5") : secret "dns-default-metrics-tls" not found Apr 22 18:36:40.969023 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:40.968991 2572 generic.go:358] "Generic (PLEG): container finished" podID="3a24b441-1f95-45b3-b520-483d996f771f" containerID="1e2817f6e3b9f203c272c12a8c2861977d41f1a8bd4cea094f039f4cbddea4a0" exitCode=0 Apr 22 18:36:40.969614 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:40.969059 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zw6cn" event={"ID":"3a24b441-1f95-45b3-b520-483d996f771f","Type":"ContainerDied","Data":"1e2817f6e3b9f203c272c12a8c2861977d41f1a8bd4cea094f039f4cbddea4a0"} Apr 22 18:36:41.975066 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:41.975029 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zw6cn" 
event={"ID":"3a24b441-1f95-45b3-b520-483d996f771f","Type":"ContainerStarted","Data":"82561925767ee9b7854ab332c0b4f326b00aa32e6ef31a6f1a79fbacd7f8d162"} Apr 22 18:36:42.001013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:42.000954 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zw6cn" podStartSLOduration=5.09545071 podStartE2EDuration="37.000938549s" podCreationTimestamp="2026-04-22 18:36:05 +0000 UTC" firstStartedPulling="2026-04-22 18:36:06.945119018 +0000 UTC m=+1.804287716" lastFinishedPulling="2026-04-22 18:36:38.85060684 +0000 UTC m=+33.709775555" observedRunningTime="2026-04-22 18:36:42.000533309 +0000 UTC m=+36.859702099" watchObservedRunningTime="2026-04-22 18:36:42.000938549 +0000 UTC m=+36.860107280" Apr 22 18:36:42.978816 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:42.978644 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5h8ps" event={"ID":"516d0b19-b6db-46c2-9865-24e9c2e844fc","Type":"ContainerStarted","Data":"5e13f7d324b5968ba91e7efd3d5ba9118f51d53b9820c4d4f176cb73f746ad01"} Apr 22 18:36:42.979174 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:42.979161 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:36:44.893712 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:44.893673 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert\") pod \"ingress-canary-2phz6\" (UID: \"f0ecf33d-061b-4ba1-9f1e-ec8f458b1027\") " pod="openshift-ingress-canary/ingress-canary-2phz6" Apr 22 18:36:44.894088 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:44.893737 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls\") pod \"dns-default-ntjbb\" (UID: \"57975d7c-6756-4dde-9d27-faa3e96cc6f5\") " pod="openshift-dns/dns-default-ntjbb" Apr 22 18:36:44.894088 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:44.893820 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:44.894088 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:44.893883 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert podName:f0ecf33d-061b-4ba1-9f1e-ec8f458b1027 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:52.893868083 +0000 UTC m=+47.753036780 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert") pod "ingress-canary-2phz6" (UID: "f0ecf33d-061b-4ba1-9f1e-ec8f458b1027") : secret "canary-serving-cert" not found Apr 22 18:36:44.894088 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:44.893820 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:44.894088 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:44.893954 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls podName:57975d7c-6756-4dde-9d27-faa3e96cc6f5 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:52.893941034 +0000 UTC m=+47.753109737 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls") pod "dns-default-ntjbb" (UID: "57975d7c-6756-4dde-9d27-faa3e96cc6f5") : secret "dns-default-metrics-tls" not found Apr 22 18:36:52.942438 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:52.942406 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls\") pod \"dns-default-ntjbb\" (UID: \"57975d7c-6756-4dde-9d27-faa3e96cc6f5\") " pod="openshift-dns/dns-default-ntjbb" Apr 22 18:36:52.942786 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:36:52.942451 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert\") pod \"ingress-canary-2phz6\" (UID: \"f0ecf33d-061b-4ba1-9f1e-ec8f458b1027\") " pod="openshift-ingress-canary/ingress-canary-2phz6" Apr 22 18:36:52.942786 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:52.942543 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:52.942786 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:52.942545 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:52.942786 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:52.942598 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert podName:f0ecf33d-061b-4ba1-9f1e-ec8f458b1027 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:08.942580835 +0000 UTC m=+63.801749536 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert") pod "ingress-canary-2phz6" (UID: "f0ecf33d-061b-4ba1-9f1e-ec8f458b1027") : secret "canary-serving-cert" not found Apr 22 18:36:52.942786 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:36:52.942612 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls podName:57975d7c-6756-4dde-9d27-faa3e96cc6f5 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:08.942605885 +0000 UTC m=+63.801774583 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls") pod "dns-default-ntjbb" (UID: "57975d7c-6756-4dde-9d27-faa3e96cc6f5") : secret "dns-default-metrics-tls" not found Apr 22 18:37:05.314701 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:37:05.314667 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vjgnk" Apr 22 18:37:05.343300 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:37:05.343256 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-5h8ps" podStartSLOduration=56.931638865 podStartE2EDuration="1m0.34324334s" podCreationTimestamp="2026-04-22 18:36:05 +0000 UTC" firstStartedPulling="2026-04-22 18:36:38.995150449 +0000 UTC m=+33.854319152" lastFinishedPulling="2026-04-22 18:36:42.406754925 +0000 UTC m=+37.265923627" observedRunningTime="2026-04-22 18:36:42.995442012 +0000 UTC m=+37.854610742" watchObservedRunningTime="2026-04-22 18:37:05.34324334 +0000 UTC m=+60.202412090" Apr 22 18:37:08.946134 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:37:08.946103 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls\") pod \"dns-default-ntjbb\" (UID: \"57975d7c-6756-4dde-9d27-faa3e96cc6f5\") " pod="openshift-dns/dns-default-ntjbb" Apr 22 18:37:08.946522 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:37:08.946148 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert\") pod \"ingress-canary-2phz6\" (UID: \"f0ecf33d-061b-4ba1-9f1e-ec8f458b1027\") " pod="openshift-ingress-canary/ingress-canary-2phz6" Apr 22 18:37:08.946522 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:37:08.946240 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:37:08.946522 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:37:08.946244 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:37:08.946522 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:37:08.946303 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert podName:f0ecf33d-061b-4ba1-9f1e-ec8f458b1027 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:40.946284737 +0000 UTC m=+95.805453437 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert") pod "ingress-canary-2phz6" (UID: "f0ecf33d-061b-4ba1-9f1e-ec8f458b1027") : secret "canary-serving-cert" not found Apr 22 18:37:08.946522 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:37:08.946318 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls podName:57975d7c-6756-4dde-9d27-faa3e96cc6f5 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:40.946311749 +0000 UTC m=+95.805480446 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls") pod "dns-default-ntjbb" (UID: "57975d7c-6756-4dde-9d27-faa3e96cc6f5") : secret "dns-default-metrics-tls" not found Apr 22 18:37:10.355965 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:37:10.355919 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs\") pod \"network-metrics-daemon-k7crw\" (UID: \"fff77f0b-c2fb-4acb-b894-ce916d7cf9d2\") " pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:37:10.356359 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:37:10.356038 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:37:10.356359 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:37:10.356090 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs podName:fff77f0b-c2fb-4acb-b894-ce916d7cf9d2 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:14.356076652 +0000 UTC m=+129.215245350 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs") pod "network-metrics-daemon-k7crw" (UID: "fff77f0b-c2fb-4acb-b894-ce916d7cf9d2") : secret "metrics-daemon-secret" not found Apr 22 18:37:14.985393 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:37:14.985356 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-5h8ps" Apr 22 18:37:40.953473 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:37:40.953361 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert\") pod \"ingress-canary-2phz6\" (UID: \"f0ecf33d-061b-4ba1-9f1e-ec8f458b1027\") " pod="openshift-ingress-canary/ingress-canary-2phz6" Apr 22 18:37:40.953473 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:37:40.953429 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls\") pod \"dns-default-ntjbb\" (UID: \"57975d7c-6756-4dde-9d27-faa3e96cc6f5\") " pod="openshift-dns/dns-default-ntjbb" Apr 22 18:37:40.953939 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:37:40.953482 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:37:40.953939 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:37:40.953525 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:37:40.953939 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:37:40.953543 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert podName:f0ecf33d-061b-4ba1-9f1e-ec8f458b1027 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:38:44.953526844 +0000 UTC m=+159.812695541 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert") pod "ingress-canary-2phz6" (UID: "f0ecf33d-061b-4ba1-9f1e-ec8f458b1027") : secret "canary-serving-cert" not found Apr 22 18:37:40.953939 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:37:40.953577 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls podName:57975d7c-6756-4dde-9d27-faa3e96cc6f5 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:44.953565409 +0000 UTC m=+159.812734106 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls") pod "dns-default-ntjbb" (UID: "57975d7c-6756-4dde-9d27-faa3e96cc6f5") : secret "dns-default-metrics-tls" not found Apr 22 18:38:04.448714 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.448675 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8z4qp"] Apr 22 18:38:04.451455 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.451437 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8z4qp" Apr 22 18:38:04.453932 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.453913 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:38:04.454045 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.453951 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 18:38:04.454770 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.454752 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-k264q\"" Apr 22 18:38:04.462927 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.462908 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8z4qp"] Apr 22 18:38:04.514556 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.514513 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r65zz\" (UniqueName: \"kubernetes.io/projected/7e46d4ab-f18c-4fbb-b659-be241b0d7c69-kube-api-access-r65zz\") pod \"volume-data-source-validator-7c6cbb6c87-8z4qp\" (UID: \"7e46d4ab-f18c-4fbb-b659-be241b0d7c69\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8z4qp" Apr 22 18:38:04.552091 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.552054 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-gs4mb"] Apr 22 18:38:04.554791 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.554764 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-6brbj"] Apr 22 18:38:04.554906 ip-10-0-131-5 kubenswrapper[2572]: I0422 
18:38:04.554836 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-gs4mb" Apr 22 18:38:04.557494 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.557469 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-hwc6w\"" Apr 22 18:38:04.557852 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.557469 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 18:38:04.557852 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.557480 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:38:04.557852 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.557641 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 18:38:04.558136 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.558118 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" Apr 22 18:38:04.558204 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.558139 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 18:38:04.560768 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.560718 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 18:38:04.560768 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.560720 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 18:38:04.560937 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.560925 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:38:04.561166 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.561144 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 18:38:04.561237 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.561206 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-r55g9\"" Apr 22 18:38:04.564432 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.564413 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 18:38:04.566177 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.566161 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 18:38:04.573201 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.571127 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-gs4mb"] 
Apr 22 18:38:04.573201 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.571948 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-6brbj"] Apr 22 18:38:04.615677 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.615639 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e949ed90-0ea2-43e9-8cbc-ae1bec9390c9-tmp\") pod \"insights-operator-585dfdc468-gs4mb\" (UID: \"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9\") " pod="openshift-insights/insights-operator-585dfdc468-gs4mb" Apr 22 18:38:04.615677 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.615678 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e949ed90-0ea2-43e9-8cbc-ae1bec9390c9-snapshots\") pod \"insights-operator-585dfdc468-gs4mb\" (UID: \"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9\") " pod="openshift-insights/insights-operator-585dfdc468-gs4mb" Apr 22 18:38:04.615913 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.615706 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e949ed90-0ea2-43e9-8cbc-ae1bec9390c9-serving-cert\") pod \"insights-operator-585dfdc468-gs4mb\" (UID: \"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9\") " pod="openshift-insights/insights-operator-585dfdc468-gs4mb" Apr 22 18:38:04.615913 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.615729 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60e694f5-420a-4a93-b793-b951e02e4c81-trusted-ca\") pod \"console-operator-9d4b6777b-6brbj\" (UID: \"60e694f5-420a-4a93-b793-b951e02e4c81\") " pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" Apr 22 18:38:04.615913 ip-10-0-131-5 
kubenswrapper[2572]: I0422 18:38:04.615761 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60e694f5-420a-4a93-b793-b951e02e4c81-serving-cert\") pod \"console-operator-9d4b6777b-6brbj\" (UID: \"60e694f5-420a-4a93-b793-b951e02e4c81\") " pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" Apr 22 18:38:04.615913 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.615791 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60e694f5-420a-4a93-b793-b951e02e4c81-config\") pod \"console-operator-9d4b6777b-6brbj\" (UID: \"60e694f5-420a-4a93-b793-b951e02e4c81\") " pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" Apr 22 18:38:04.615913 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.615865 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r65zz\" (UniqueName: \"kubernetes.io/projected/7e46d4ab-f18c-4fbb-b659-be241b0d7c69-kube-api-access-r65zz\") pod \"volume-data-source-validator-7c6cbb6c87-8z4qp\" (UID: \"7e46d4ab-f18c-4fbb-b659-be241b0d7c69\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8z4qp" Apr 22 18:38:04.615913 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.615901 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e949ed90-0ea2-43e9-8cbc-ae1bec9390c9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-gs4mb\" (UID: \"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9\") " pod="openshift-insights/insights-operator-585dfdc468-gs4mb" Apr 22 18:38:04.616089 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.615919 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc9qw\" (UniqueName: 
\"kubernetes.io/projected/e949ed90-0ea2-43e9-8cbc-ae1bec9390c9-kube-api-access-zc9qw\") pod \"insights-operator-585dfdc468-gs4mb\" (UID: \"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9\") " pod="openshift-insights/insights-operator-585dfdc468-gs4mb" Apr 22 18:38:04.616089 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.616026 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e949ed90-0ea2-43e9-8cbc-ae1bec9390c9-service-ca-bundle\") pod \"insights-operator-585dfdc468-gs4mb\" (UID: \"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9\") " pod="openshift-insights/insights-operator-585dfdc468-gs4mb" Apr 22 18:38:04.616089 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.616046 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2wb7\" (UniqueName: \"kubernetes.io/projected/60e694f5-420a-4a93-b793-b951e02e4c81-kube-api-access-f2wb7\") pod \"console-operator-9d4b6777b-6brbj\" (UID: \"60e694f5-420a-4a93-b793-b951e02e4c81\") " pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" Apr 22 18:38:04.625538 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.625509 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r65zz\" (UniqueName: \"kubernetes.io/projected/7e46d4ab-f18c-4fbb-b659-be241b0d7c69-kube-api-access-r65zz\") pod \"volume-data-source-validator-7c6cbb6c87-8z4qp\" (UID: \"7e46d4ab-f18c-4fbb-b659-be241b0d7c69\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8z4qp" Apr 22 18:38:04.651648 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.651615 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k7ljr"] Apr 22 18:38:04.654591 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.654570 2572 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k7ljr" Apr 22 18:38:04.655532 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.655513 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-69c74fc656-2f57x"] Apr 22 18:38:04.657526 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.657508 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 18:38:04.657612 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.657546 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-v7pfp\"" Apr 22 18:38:04.657950 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.657934 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:38:04.657990 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.657935 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 18:38:04.658155 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.658141 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:04.658197 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.658160 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 18:38:04.660510 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.660490 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 18:38:04.660755 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.660740 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 22 18:38:04.660872 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.660850 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 22 18:38:04.660984 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.660935 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 22 18:38:04.661089 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.661074 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 22 18:38:04.661142 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.661110 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-jjqxb\"" Apr 22 18:38:04.661241 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.661227 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 18:38:04.664216 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.664192 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k7ljr"] Apr 22 18:38:04.680257 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.680230 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-69c74fc656-2f57x"] Apr 22 18:38:04.716858 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.716758 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60e694f5-420a-4a93-b793-b951e02e4c81-serving-cert\") pod \"console-operator-9d4b6777b-6brbj\" (UID: \"60e694f5-420a-4a93-b793-b951e02e4c81\") " pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" Apr 22 18:38:04.716858 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.716798 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89183697-99ab-489f-95ef-9654164feac8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-k7ljr\" (UID: \"89183697-99ab-489f-95ef-9654164feac8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k7ljr" Apr 22 18:38:04.716858 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.716826 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60e694f5-420a-4a93-b793-b951e02e4c81-config\") pod \"console-operator-9d4b6777b-6brbj\" (UID: \"60e694f5-420a-4a93-b793-b951e02e4c81\") " pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" Apr 22 18:38:04.716858 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.716844 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpqcf\" (UniqueName: \"kubernetes.io/projected/1567a865-78f8-433b-a4dd-e7478597180f-kube-api-access-hpqcf\") pod \"router-default-69c74fc656-2f57x\" (UID: 
\"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:04.717166 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.716901 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e949ed90-0ea2-43e9-8cbc-ae1bec9390c9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-gs4mb\" (UID: \"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9\") " pod="openshift-insights/insights-operator-585dfdc468-gs4mb" Apr 22 18:38:04.717166 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.716917 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zc9qw\" (UniqueName: \"kubernetes.io/projected/e949ed90-0ea2-43e9-8cbc-ae1bec9390c9-kube-api-access-zc9qw\") pod \"insights-operator-585dfdc468-gs4mb\" (UID: \"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9\") " pod="openshift-insights/insights-operator-585dfdc468-gs4mb" Apr 22 18:38:04.717166 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.716936 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-stats-auth\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:04.717166 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.716961 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e949ed90-0ea2-43e9-8cbc-ae1bec9390c9-service-ca-bundle\") pod \"insights-operator-585dfdc468-gs4mb\" (UID: \"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9\") " pod="openshift-insights/insights-operator-585dfdc468-gs4mb" Apr 22 18:38:04.717166 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.717090 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-f2wb7\" (UniqueName: \"kubernetes.io/projected/60e694f5-420a-4a93-b793-b951e02e4c81-kube-api-access-f2wb7\") pod \"console-operator-9d4b6777b-6brbj\" (UID: \"60e694f5-420a-4a93-b793-b951e02e4c81\") " pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" Apr 22 18:38:04.717166 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.717156 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-default-certificate\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:04.717517 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.717210 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:04.717517 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.717235 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89183697-99ab-489f-95ef-9654164feac8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-k7ljr\" (UID: \"89183697-99ab-489f-95ef-9654164feac8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k7ljr" Apr 22 18:38:04.717517 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.717264 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2hmp\" (UniqueName: \"kubernetes.io/projected/89183697-99ab-489f-95ef-9654164feac8-kube-api-access-n2hmp\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-k7ljr\" (UID: \"89183697-99ab-489f-95ef-9654164feac8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k7ljr" Apr 22 18:38:04.717517 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.717300 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e949ed90-0ea2-43e9-8cbc-ae1bec9390c9-tmp\") pod \"insights-operator-585dfdc468-gs4mb\" (UID: \"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9\") " pod="openshift-insights/insights-operator-585dfdc468-gs4mb" Apr 22 18:38:04.717517 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.717347 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e949ed90-0ea2-43e9-8cbc-ae1bec9390c9-snapshots\") pod \"insights-operator-585dfdc468-gs4mb\" (UID: \"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9\") " pod="openshift-insights/insights-operator-585dfdc468-gs4mb" Apr 22 18:38:04.717517 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.717375 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e949ed90-0ea2-43e9-8cbc-ae1bec9390c9-serving-cert\") pod \"insights-operator-585dfdc468-gs4mb\" (UID: \"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9\") " pod="openshift-insights/insights-operator-585dfdc468-gs4mb" Apr 22 18:38:04.717517 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.717398 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60e694f5-420a-4a93-b793-b951e02e4c81-trusted-ca\") pod \"console-operator-9d4b6777b-6brbj\" (UID: \"60e694f5-420a-4a93-b793-b951e02e4c81\") " pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" Apr 22 18:38:04.717517 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.717423 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-metrics-certs\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:04.717865 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.717639 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e949ed90-0ea2-43e9-8cbc-ae1bec9390c9-service-ca-bundle\") pod \"insights-operator-585dfdc468-gs4mb\" (UID: \"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9\") " pod="openshift-insights/insights-operator-585dfdc468-gs4mb" Apr 22 18:38:04.717865 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.717675 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60e694f5-420a-4a93-b793-b951e02e4c81-config\") pod \"console-operator-9d4b6777b-6brbj\" (UID: \"60e694f5-420a-4a93-b793-b951e02e4c81\") " pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" Apr 22 18:38:04.717935 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.717912 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e949ed90-0ea2-43e9-8cbc-ae1bec9390c9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-gs4mb\" (UID: \"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9\") " pod="openshift-insights/insights-operator-585dfdc468-gs4mb" Apr 22 18:38:04.718003 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.717973 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e949ed90-0ea2-43e9-8cbc-ae1bec9390c9-snapshots\") pod \"insights-operator-585dfdc468-gs4mb\" (UID: \"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9\") " pod="openshift-insights/insights-operator-585dfdc468-gs4mb" Apr 22 
18:38:04.718242 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.718225 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60e694f5-420a-4a93-b793-b951e02e4c81-trusted-ca\") pod \"console-operator-9d4b6777b-6brbj\" (UID: \"60e694f5-420a-4a93-b793-b951e02e4c81\") " pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" Apr 22 18:38:04.718436 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.718417 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e949ed90-0ea2-43e9-8cbc-ae1bec9390c9-tmp\") pod \"insights-operator-585dfdc468-gs4mb\" (UID: \"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9\") " pod="openshift-insights/insights-operator-585dfdc468-gs4mb" Apr 22 18:38:04.719389 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.719369 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60e694f5-420a-4a93-b793-b951e02e4c81-serving-cert\") pod \"console-operator-9d4b6777b-6brbj\" (UID: \"60e694f5-420a-4a93-b793-b951e02e4c81\") " pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" Apr 22 18:38:04.719678 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.719662 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e949ed90-0ea2-43e9-8cbc-ae1bec9390c9-serving-cert\") pod \"insights-operator-585dfdc468-gs4mb\" (UID: \"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9\") " pod="openshift-insights/insights-operator-585dfdc468-gs4mb" Apr 22 18:38:04.725953 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.725922 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc9qw\" (UniqueName: \"kubernetes.io/projected/e949ed90-0ea2-43e9-8cbc-ae1bec9390c9-kube-api-access-zc9qw\") pod \"insights-operator-585dfdc468-gs4mb\" (UID: 
\"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9\") " pod="openshift-insights/insights-operator-585dfdc468-gs4mb" Apr 22 18:38:04.726046 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.725926 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2wb7\" (UniqueName: \"kubernetes.io/projected/60e694f5-420a-4a93-b793-b951e02e4c81-kube-api-access-f2wb7\") pod \"console-operator-9d4b6777b-6brbj\" (UID: \"60e694f5-420a-4a93-b793-b951e02e4c81\") " pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" Apr 22 18:38:04.760059 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.760019 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8z4qp" Apr 22 18:38:04.818275 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.818238 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-stats-auth\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:04.818445 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.818309 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-default-certificate\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:04.818445 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.818376 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " 
pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:04.818445 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.818405 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89183697-99ab-489f-95ef-9654164feac8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-k7ljr\" (UID: \"89183697-99ab-489f-95ef-9654164feac8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k7ljr" Apr 22 18:38:04.818445 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.818435 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2hmp\" (UniqueName: \"kubernetes.io/projected/89183697-99ab-489f-95ef-9654164feac8-kube-api-access-n2hmp\") pod \"kube-storage-version-migrator-operator-6769c5d45-k7ljr\" (UID: \"89183697-99ab-489f-95ef-9654164feac8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k7ljr" Apr 22 18:38:04.818663 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.818471 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-metrics-certs\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:04.818663 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.818517 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89183697-99ab-489f-95ef-9654164feac8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-k7ljr\" (UID: \"89183697-99ab-489f-95ef-9654164feac8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k7ljr" Apr 22 18:38:04.818663 ip-10-0-131-5 
kubenswrapper[2572]: I0422 18:38:04.818566 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpqcf\" (UniqueName: \"kubernetes.io/projected/1567a865-78f8-433b-a4dd-e7478597180f-kube-api-access-hpqcf\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:04.819417 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:04.819192 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:38:04.819417 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:04.819277 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-metrics-certs podName:1567a865-78f8-433b-a4dd-e7478597180f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:05.319255521 +0000 UTC m=+120.178424230 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-metrics-certs") pod "router-default-69c74fc656-2f57x" (UID: "1567a865-78f8-433b-a4dd-e7478597180f") : secret "router-metrics-certs-default" not found Apr 22 18:38:04.819417 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:04.819380 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle podName:1567a865-78f8-433b-a4dd-e7478597180f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:05.319357509 +0000 UTC m=+120.178526230 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle") pod "router-default-69c74fc656-2f57x" (UID: "1567a865-78f8-433b-a4dd-e7478597180f") : configmap references non-existent config key: service-ca.crt Apr 22 18:38:04.820240 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.820191 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89183697-99ab-489f-95ef-9654164feac8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-k7ljr\" (UID: \"89183697-99ab-489f-95ef-9654164feac8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k7ljr" Apr 22 18:38:04.821863 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.821822 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89183697-99ab-489f-95ef-9654164feac8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-k7ljr\" (UID: \"89183697-99ab-489f-95ef-9654164feac8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k7ljr" Apr 22 18:38:04.822189 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.822141 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-default-certificate\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:04.822792 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.822768 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-stats-auth\") pod \"router-default-69c74fc656-2f57x\" (UID: 
\"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:04.827255 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.827217 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpqcf\" (UniqueName: \"kubernetes.io/projected/1567a865-78f8-433b-a4dd-e7478597180f-kube-api-access-hpqcf\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:04.827376 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.827355 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2hmp\" (UniqueName: \"kubernetes.io/projected/89183697-99ab-489f-95ef-9654164feac8-kube-api-access-n2hmp\") pod \"kube-storage-version-migrator-operator-6769c5d45-k7ljr\" (UID: \"89183697-99ab-489f-95ef-9654164feac8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k7ljr" Apr 22 18:38:04.864624 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.864594 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-gs4mb" Apr 22 18:38:04.870407 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.870378 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" Apr 22 18:38:04.874297 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.874236 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8z4qp"] Apr 22 18:38:04.879048 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:38:04.878995 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e46d4ab_f18c_4fbb_b659_be241b0d7c69.slice/crio-42cfb0c03e0d55c82d1b7f9f7e57cd2e58f0a7d47e042c29b956b7a1a1742dc4 WatchSource:0}: Error finding container 42cfb0c03e0d55c82d1b7f9f7e57cd2e58f0a7d47e042c29b956b7a1a1742dc4: Status 404 returned error can't find the container with id 42cfb0c03e0d55c82d1b7f9f7e57cd2e58f0a7d47e042c29b956b7a1a1742dc4 Apr 22 18:38:04.965672 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.965645 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k7ljr" Apr 22 18:38:04.985953 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.985923 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-gs4mb"] Apr 22 18:38:04.989266 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:38:04.989238 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode949ed90_0ea2_43e9_8cbc_ae1bec9390c9.slice/crio-d20a44be0f613a25b0c4466d0cab468e35d4da8883e46ed8336702871e6b0321 WatchSource:0}: Error finding container d20a44be0f613a25b0c4466d0cab468e35d4da8883e46ed8336702871e6b0321: Status 404 returned error can't find the container with id d20a44be0f613a25b0c4466d0cab468e35d4da8883e46ed8336702871e6b0321 Apr 22 18:38:04.998621 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:04.998597 2572 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-6brbj"] Apr 22 18:38:05.001649 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:38:05.001624 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60e694f5_420a_4a93_b793_b951e02e4c81.slice/crio-4733a232076e3932cecc3f1bbc6b5849845d9e96b57acece6ad9a7ebb502dfc8 WatchSource:0}: Error finding container 4733a232076e3932cecc3f1bbc6b5849845d9e96b57acece6ad9a7ebb502dfc8: Status 404 returned error can't find the container with id 4733a232076e3932cecc3f1bbc6b5849845d9e96b57acece6ad9a7ebb502dfc8 Apr 22 18:38:05.079563 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:05.079531 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k7ljr"] Apr 22 18:38:05.082777 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:38:05.082735 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89183697_99ab_489f_95ef_9654164feac8.slice/crio-2355069c3ba820616e2ef46d41d83db3ccb09cc3c6d673432697e697be6574da WatchSource:0}: Error finding container 2355069c3ba820616e2ef46d41d83db3ccb09cc3c6d673432697e697be6574da: Status 404 returned error can't find the container with id 2355069c3ba820616e2ef46d41d83db3ccb09cc3c6d673432697e697be6574da Apr 22 18:38:05.124318 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:05.124284 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" event={"ID":"60e694f5-420a-4a93-b793-b951e02e4c81","Type":"ContainerStarted","Data":"4733a232076e3932cecc3f1bbc6b5849845d9e96b57acece6ad9a7ebb502dfc8"} Apr 22 18:38:05.125241 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:05.125217 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k7ljr" event={"ID":"89183697-99ab-489f-95ef-9654164feac8","Type":"ContainerStarted","Data":"2355069c3ba820616e2ef46d41d83db3ccb09cc3c6d673432697e697be6574da"} Apr 22 18:38:05.126248 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:05.126228 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gs4mb" event={"ID":"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9","Type":"ContainerStarted","Data":"d20a44be0f613a25b0c4466d0cab468e35d4da8883e46ed8336702871e6b0321"} Apr 22 18:38:05.127144 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:05.127114 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8z4qp" event={"ID":"7e46d4ab-f18c-4fbb-b659-be241b0d7c69","Type":"ContainerStarted","Data":"42cfb0c03e0d55c82d1b7f9f7e57cd2e58f0a7d47e042c29b956b7a1a1742dc4"} Apr 22 18:38:05.321281 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:05.321195 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:05.321281 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:05.321235 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-metrics-certs\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:05.321472 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:05.321360 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle podName:1567a865-78f8-433b-a4dd-e7478597180f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:06.321343314 +0000 UTC m=+121.180512011 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle") pod "router-default-69c74fc656-2f57x" (UID: "1567a865-78f8-433b-a4dd-e7478597180f") : configmap references non-existent config key: service-ca.crt Apr 22 18:38:05.321472 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:05.321410 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:38:05.321472 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:05.321463 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-metrics-certs podName:1567a865-78f8-433b-a4dd-e7478597180f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:06.321451687 +0000 UTC m=+121.180620389 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-metrics-certs") pod "router-default-69c74fc656-2f57x" (UID: "1567a865-78f8-433b-a4dd-e7478597180f") : secret "router-metrics-certs-default" not found Apr 22 18:38:06.329768 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:06.329721 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:06.330282 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:06.329795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-metrics-certs\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:06.330282 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:06.329921 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle podName:1567a865-78f8-433b-a4dd-e7478597180f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:08.32989921 +0000 UTC m=+123.189067919 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle") pod "router-default-69c74fc656-2f57x" (UID: "1567a865-78f8-433b-a4dd-e7478597180f") : configmap references non-existent config key: service-ca.crt Apr 22 18:38:06.330282 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:06.329998 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:38:06.330282 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:06.330054 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-metrics-certs podName:1567a865-78f8-433b-a4dd-e7478597180f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:08.330038954 +0000 UTC m=+123.189207655 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-metrics-certs") pod "router-default-69c74fc656-2f57x" (UID: "1567a865-78f8-433b-a4dd-e7478597180f") : secret "router-metrics-certs-default" not found Apr 22 18:38:08.349171 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:08.349132 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:08.349537 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:08.349189 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-metrics-certs\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " 
pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:08.349537 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:08.349287 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle podName:1567a865-78f8-433b-a4dd-e7478597180f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:12.349270106 +0000 UTC m=+127.208438803 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle") pod "router-default-69c74fc656-2f57x" (UID: "1567a865-78f8-433b-a4dd-e7478597180f") : configmap references non-existent config key: service-ca.crt Apr 22 18:38:08.349537 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:08.349344 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:38:08.349537 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:08.349401 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-metrics-certs podName:1567a865-78f8-433b-a4dd-e7478597180f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:12.349385454 +0000 UTC m=+127.208554155 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-metrics-certs") pod "router-default-69c74fc656-2f57x" (UID: "1567a865-78f8-433b-a4dd-e7478597180f") : secret "router-metrics-certs-default" not found Apr 22 18:38:09.138597 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:09.138571 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/0.log" Apr 22 18:38:09.138771 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:09.138608 2572 generic.go:358] "Generic (PLEG): container finished" podID="60e694f5-420a-4a93-b793-b951e02e4c81" containerID="8f8e18d23a3c84dd461ac0dd808d8febfab1b086a81861615afaee4b46d9487c" exitCode=255 Apr 22 18:38:09.138771 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:09.138676 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" event={"ID":"60e694f5-420a-4a93-b793-b951e02e4c81","Type":"ContainerDied","Data":"8f8e18d23a3c84dd461ac0dd808d8febfab1b086a81861615afaee4b46d9487c"} Apr 22 18:38:09.138988 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:09.138965 2572 scope.go:117] "RemoveContainer" containerID="8f8e18d23a3c84dd461ac0dd808d8febfab1b086a81861615afaee4b46d9487c" Apr 22 18:38:09.140109 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:09.140085 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k7ljr" event={"ID":"89183697-99ab-489f-95ef-9654164feac8","Type":"ContainerStarted","Data":"b195bd4cac6e970ce477b13bddcb794935fb3557288608b2b3041cba3573e04b"} Apr 22 18:38:09.141491 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:09.141465 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gs4mb" 
event={"ID":"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9","Type":"ContainerStarted","Data":"3dbdc280d87418e46bf1d5e462c5aaa2a406e5ba640ee02d8abe00a7b159b402"} Apr 22 18:38:09.142782 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:09.142762 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8z4qp" event={"ID":"7e46d4ab-f18c-4fbb-b659-be241b0d7c69","Type":"ContainerStarted","Data":"0de93584dd8225153ade8140d420e94ae58923ebed88d7999e7cc6d34c418092"} Apr 22 18:38:09.194898 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:09.194851 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8z4qp" podStartSLOduration=2.028658421 podStartE2EDuration="5.194836675s" podCreationTimestamp="2026-04-22 18:38:04 +0000 UTC" firstStartedPulling="2026-04-22 18:38:04.880659926 +0000 UTC m=+119.739828624" lastFinishedPulling="2026-04-22 18:38:08.04683818 +0000 UTC m=+122.906006878" observedRunningTime="2026-04-22 18:38:09.194045251 +0000 UTC m=+124.053213970" watchObservedRunningTime="2026-04-22 18:38:09.194836675 +0000 UTC m=+124.054005393" Apr 22 18:38:09.216727 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:09.216673 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k7ljr" podStartSLOduration=2.254396447 podStartE2EDuration="5.216659664s" podCreationTimestamp="2026-04-22 18:38:04 +0000 UTC" firstStartedPulling="2026-04-22 18:38:05.08454963 +0000 UTC m=+119.943718327" lastFinishedPulling="2026-04-22 18:38:08.046812827 +0000 UTC m=+122.905981544" observedRunningTime="2026-04-22 18:38:09.215616219 +0000 UTC m=+124.074784952" watchObservedRunningTime="2026-04-22 18:38:09.216659664 +0000 UTC m=+124.075828390" Apr 22 18:38:09.238491 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:09.238430 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-gs4mb" podStartSLOduration=2.182749509 podStartE2EDuration="5.238413046s" podCreationTimestamp="2026-04-22 18:38:04 +0000 UTC" firstStartedPulling="2026-04-22 18:38:04.991147961 +0000 UTC m=+119.850316659" lastFinishedPulling="2026-04-22 18:38:08.046811499 +0000 UTC m=+122.905980196" observedRunningTime="2026-04-22 18:38:09.237687505 +0000 UTC m=+124.096856234" watchObservedRunningTime="2026-04-22 18:38:09.238413046 +0000 UTC m=+124.097581766" Apr 22 18:38:10.146764 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.146738 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/1.log" Apr 22 18:38:10.147262 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.147060 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/0.log" Apr 22 18:38:10.147262 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.147086 2572 generic.go:358] "Generic (PLEG): container finished" podID="60e694f5-420a-4a93-b793-b951e02e4c81" containerID="9686d85f36038a015886de9749809411779a863caf30fc4a449546482577e6ec" exitCode=255 Apr 22 18:38:10.147262 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.147192 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" event={"ID":"60e694f5-420a-4a93-b793-b951e02e4c81","Type":"ContainerDied","Data":"9686d85f36038a015886de9749809411779a863caf30fc4a449546482577e6ec"} Apr 22 18:38:10.147262 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.147237 2572 scope.go:117] "RemoveContainer" containerID="8f8e18d23a3c84dd461ac0dd808d8febfab1b086a81861615afaee4b46d9487c" Apr 22 18:38:10.147460 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.147437 
2572 scope.go:117] "RemoveContainer" containerID="9686d85f36038a015886de9749809411779a863caf30fc4a449546482577e6ec" Apr 22 18:38:10.147656 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:10.147630 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-6brbj_openshift-console-operator(60e694f5-420a-4a93-b793-b951e02e4c81)\"" pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" podUID="60e694f5-420a-4a93-b793-b951e02e4c81" Apr 22 18:38:10.512564 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.512475 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zr7k2"] Apr 22 18:38:10.516468 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.516442 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-zr7k2" Apr 22 18:38:10.519111 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.519090 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 18:38:10.519233 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.519122 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 18:38:10.519914 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.519897 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 18:38:10.520013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.519897 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 18:38:10.520013 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.519898 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-kt45l\"" Apr 22 18:38:10.524136 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.524114 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zr7k2"] Apr 22 18:38:10.672816 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.672783 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c6879dd6-49f5-4be9-a8db-3a753d7b6b9a-signing-key\") pod \"service-ca-865cb79987-zr7k2\" (UID: \"c6879dd6-49f5-4be9-a8db-3a753d7b6b9a\") " pod="openshift-service-ca/service-ca-865cb79987-zr7k2" Apr 22 18:38:10.672816 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.672818 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c6879dd6-49f5-4be9-a8db-3a753d7b6b9a-signing-cabundle\") pod \"service-ca-865cb79987-zr7k2\" (UID: \"c6879dd6-49f5-4be9-a8db-3a753d7b6b9a\") " pod="openshift-service-ca/service-ca-865cb79987-zr7k2" Apr 22 18:38:10.673026 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.672876 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg9w7\" (UniqueName: \"kubernetes.io/projected/c6879dd6-49f5-4be9-a8db-3a753d7b6b9a-kube-api-access-hg9w7\") pod \"service-ca-865cb79987-zr7k2\" (UID: \"c6879dd6-49f5-4be9-a8db-3a753d7b6b9a\") " pod="openshift-service-ca/service-ca-865cb79987-zr7k2" Apr 22 18:38:10.774066 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.773977 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c6879dd6-49f5-4be9-a8db-3a753d7b6b9a-signing-key\") pod \"service-ca-865cb79987-zr7k2\" (UID: \"c6879dd6-49f5-4be9-a8db-3a753d7b6b9a\") " pod="openshift-service-ca/service-ca-865cb79987-zr7k2" Apr 22 
18:38:10.774066 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.774018 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c6879dd6-49f5-4be9-a8db-3a753d7b6b9a-signing-cabundle\") pod \"service-ca-865cb79987-zr7k2\" (UID: \"c6879dd6-49f5-4be9-a8db-3a753d7b6b9a\") " pod="openshift-service-ca/service-ca-865cb79987-zr7k2" Apr 22 18:38:10.774066 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.774044 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hg9w7\" (UniqueName: \"kubernetes.io/projected/c6879dd6-49f5-4be9-a8db-3a753d7b6b9a-kube-api-access-hg9w7\") pod \"service-ca-865cb79987-zr7k2\" (UID: \"c6879dd6-49f5-4be9-a8db-3a753d7b6b9a\") " pod="openshift-service-ca/service-ca-865cb79987-zr7k2" Apr 22 18:38:10.774810 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.774790 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c6879dd6-49f5-4be9-a8db-3a753d7b6b9a-signing-cabundle\") pod \"service-ca-865cb79987-zr7k2\" (UID: \"c6879dd6-49f5-4be9-a8db-3a753d7b6b9a\") " pod="openshift-service-ca/service-ca-865cb79987-zr7k2" Apr 22 18:38:10.776406 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.776390 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c6879dd6-49f5-4be9-a8db-3a753d7b6b9a-signing-key\") pod \"service-ca-865cb79987-zr7k2\" (UID: \"c6879dd6-49f5-4be9-a8db-3a753d7b6b9a\") " pod="openshift-service-ca/service-ca-865cb79987-zr7k2" Apr 22 18:38:10.783663 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.783640 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg9w7\" (UniqueName: \"kubernetes.io/projected/c6879dd6-49f5-4be9-a8db-3a753d7b6b9a-kube-api-access-hg9w7\") pod \"service-ca-865cb79987-zr7k2\" (UID: 
\"c6879dd6-49f5-4be9-a8db-3a753d7b6b9a\") " pod="openshift-service-ca/service-ca-865cb79987-zr7k2" Apr 22 18:38:10.825618 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.825579 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-zr7k2" Apr 22 18:38:10.946362 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:10.946311 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zr7k2"] Apr 22 18:38:10.949584 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:38:10.949556 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6879dd6_49f5_4be9_a8db_3a753d7b6b9a.slice/crio-8dfd50bba733d9fca2880071d499ed9e1bb997695e2e34134927cf690dfda214 WatchSource:0}: Error finding container 8dfd50bba733d9fca2880071d499ed9e1bb997695e2e34134927cf690dfda214: Status 404 returned error can't find the container with id 8dfd50bba733d9fca2880071d499ed9e1bb997695e2e34134927cf690dfda214 Apr 22 18:38:11.151062 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:11.151034 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/1.log" Apr 22 18:38:11.151541 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:11.151400 2572 scope.go:117] "RemoveContainer" containerID="9686d85f36038a015886de9749809411779a863caf30fc4a449546482577e6ec" Apr 22 18:38:11.151611 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:11.151583 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-6brbj_openshift-console-operator(60e694f5-420a-4a93-b793-b951e02e4c81)\"" pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" 
podUID="60e694f5-420a-4a93-b793-b951e02e4c81" Apr 22 18:38:11.152127 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:11.152102 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-zr7k2" event={"ID":"c6879dd6-49f5-4be9-a8db-3a753d7b6b9a","Type":"ContainerStarted","Data":"8dfd50bba733d9fca2880071d499ed9e1bb997695e2e34134927cf690dfda214"} Apr 22 18:38:12.389699 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:12.389657 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-metrics-certs\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:12.390146 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:12.389789 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:12.390146 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:12.389827 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:38:12.390146 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:12.389925 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-metrics-certs podName:1567a865-78f8-433b-a4dd-e7478597180f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:20.389904755 +0000 UTC m=+135.249073472 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-metrics-certs") pod "router-default-69c74fc656-2f57x" (UID: "1567a865-78f8-433b-a4dd-e7478597180f") : secret "router-metrics-certs-default" not found Apr 22 18:38:12.390146 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:12.389954 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle podName:1567a865-78f8-433b-a4dd-e7478597180f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:20.389935986 +0000 UTC m=+135.249104696 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle") pod "router-default-69c74fc656-2f57x" (UID: "1567a865-78f8-433b-a4dd-e7478597180f") : configmap references non-existent config key: service-ca.crt Apr 22 18:38:12.767924 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:12.767847 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hjctd_d1294e5e-31d1-48a2-8134-4d7b0f658d42/dns-node-resolver/0.log" Apr 22 18:38:13.768324 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:13.768295 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tpthn_d8425f70-4f14-4d86-b30e-3abe38269764/node-ca/0.log" Apr 22 18:38:14.406353 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:14.406290 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs\") pod \"network-metrics-daemon-k7crw\" (UID: \"fff77f0b-c2fb-4acb-b894-ce916d7cf9d2\") " pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:38:14.406558 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:14.406461 2572 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:38:14.406558 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:14.406553 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs podName:fff77f0b-c2fb-4acb-b894-ce916d7cf9d2 nodeName:}" failed. No retries permitted until 2026-04-22 18:40:16.406524782 +0000 UTC m=+251.265693505 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs") pod "network-metrics-daemon-k7crw" (UID: "fff77f0b-c2fb-4acb-b894-ce916d7cf9d2") : secret "metrics-daemon-secret" not found Apr 22 18:38:14.870797 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:14.870776 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" Apr 22 18:38:14.871057 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:14.870811 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" Apr 22 18:38:14.871126 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:14.871113 2572 scope.go:117] "RemoveContainer" containerID="9686d85f36038a015886de9749809411779a863caf30fc4a449546482577e6ec" Apr 22 18:38:14.871274 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:14.871258 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-6brbj_openshift-console-operator(60e694f5-420a-4a93-b793-b951e02e4c81)\"" pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" podUID="60e694f5-420a-4a93-b793-b951e02e4c81" Apr 22 18:38:15.164739 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:15.164698 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-zr7k2" event={"ID":"c6879dd6-49f5-4be9-a8db-3a753d7b6b9a","Type":"ContainerStarted","Data":"9e5ab8715a867e573a8b76755d41dd486b4f9edb4c86aa5e14727e8cb6ba45e2"} Apr 22 18:38:15.168878 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:15.168856 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-k7ljr_89183697-99ab-489f-95ef-9654164feac8/kube-storage-version-migrator-operator/0.log" Apr 22 18:38:15.183761 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:15.183717 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-zr7k2" podStartSLOduration=1.303758052 podStartE2EDuration="5.183701068s" podCreationTimestamp="2026-04-22 18:38:10 +0000 UTC" firstStartedPulling="2026-04-22 18:38:10.951267708 +0000 UTC m=+125.810436405" lastFinishedPulling="2026-04-22 18:38:14.831210723 +0000 UTC m=+129.690379421" observedRunningTime="2026-04-22 18:38:15.182793001 +0000 UTC m=+130.041961719" watchObservedRunningTime="2026-04-22 18:38:15.183701068 +0000 UTC m=+130.042869787" Apr 22 18:38:20.460076 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:20.460031 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-metrics-certs\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:20.460577 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:20.460155 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " 
pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:20.460577 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:20.460291 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle podName:1567a865-78f8-433b-a4dd-e7478597180f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:36.460273421 +0000 UTC m=+151.319442121 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle") pod "router-default-69c74fc656-2f57x" (UID: "1567a865-78f8-433b-a4dd-e7478597180f") : configmap references non-existent config key: service-ca.crt Apr 22 18:38:20.462428 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:20.462405 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1567a865-78f8-433b-a4dd-e7478597180f-metrics-certs\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:26.731318 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:26.731282 2572 scope.go:117] "RemoveContainer" containerID="9686d85f36038a015886de9749809411779a863caf30fc4a449546482577e6ec" Apr 22 18:38:27.193999 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:27.193973 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 18:38:27.194363 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:27.194345 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/1.log" Apr 22 18:38:27.194470 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:27.194386 2572 generic.go:358] 
"Generic (PLEG): container finished" podID="60e694f5-420a-4a93-b793-b951e02e4c81" containerID="1a8323db99c83e46ac477b9990893368a423d7bf566ae446908fba858e92c658" exitCode=255 Apr 22 18:38:27.194470 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:27.194430 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" event={"ID":"60e694f5-420a-4a93-b793-b951e02e4c81","Type":"ContainerDied","Data":"1a8323db99c83e46ac477b9990893368a423d7bf566ae446908fba858e92c658"} Apr 22 18:38:27.194580 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:27.194471 2572 scope.go:117] "RemoveContainer" containerID="9686d85f36038a015886de9749809411779a863caf30fc4a449546482577e6ec" Apr 22 18:38:27.194785 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:27.194769 2572 scope.go:117] "RemoveContainer" containerID="1a8323db99c83e46ac477b9990893368a423d7bf566ae446908fba858e92c658" Apr 22 18:38:27.194971 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:27.194953 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-6brbj_openshift-console-operator(60e694f5-420a-4a93-b793-b951e02e4c81)\"" pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" podUID="60e694f5-420a-4a93-b793-b951e02e4c81" Apr 22 18:38:28.198014 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:28.197988 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 18:38:34.871573 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:34.871522 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" Apr 22 18:38:34.871573 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:34.871576 2572 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" Apr 22 18:38:34.872007 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:34.871889 2572 scope.go:117] "RemoveContainer" containerID="1a8323db99c83e46ac477b9990893368a423d7bf566ae446908fba858e92c658" Apr 22 18:38:34.872089 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:34.872032 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-6brbj_openshift-console-operator(60e694f5-420a-4a93-b793-b951e02e4c81)\"" pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" podUID="60e694f5-420a-4a93-b793-b951e02e4c81" Apr 22 18:38:35.372168 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.372136 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-bkxxm"] Apr 22 18:38:35.374835 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.374809 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bkxxm" Apr 22 18:38:35.379914 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.379884 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:38:35.380043 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.379889 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-b54tz\"" Apr 22 18:38:35.380043 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.379974 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:38:35.384701 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.384678 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bkxxm"] Apr 22 18:38:35.467093 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.467057 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/38ed131f-ec9b-4691-91ca-1438655083f8-data-volume\") pod \"insights-runtime-extractor-bkxxm\" (UID: \"38ed131f-ec9b-4691-91ca-1438655083f8\") " pod="openshift-insights/insights-runtime-extractor-bkxxm" Apr 22 18:38:35.467265 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.467114 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/38ed131f-ec9b-4691-91ca-1438655083f8-crio-socket\") pod \"insights-runtime-extractor-bkxxm\" (UID: \"38ed131f-ec9b-4691-91ca-1438655083f8\") " pod="openshift-insights/insights-runtime-extractor-bkxxm" Apr 22 18:38:35.467265 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.467162 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/38ed131f-ec9b-4691-91ca-1438655083f8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bkxxm\" (UID: \"38ed131f-ec9b-4691-91ca-1438655083f8\") " pod="openshift-insights/insights-runtime-extractor-bkxxm" Apr 22 18:38:35.467265 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.467241 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/38ed131f-ec9b-4691-91ca-1438655083f8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bkxxm\" (UID: \"38ed131f-ec9b-4691-91ca-1438655083f8\") " pod="openshift-insights/insights-runtime-extractor-bkxxm" Apr 22 18:38:35.467265 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.467264 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnhwq\" (UniqueName: \"kubernetes.io/projected/38ed131f-ec9b-4691-91ca-1438655083f8-kube-api-access-xnhwq\") pod \"insights-runtime-extractor-bkxxm\" (UID: \"38ed131f-ec9b-4691-91ca-1438655083f8\") " pod="openshift-insights/insights-runtime-extractor-bkxxm" Apr 22 18:38:35.471168 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.471138 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fmqhj"] Apr 22 18:38:35.472999 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.472979 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fmqhj" Apr 22 18:38:35.475377 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.475359 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 18:38:35.475526 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.475490 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 18:38:35.475526 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.475496 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-fsckm\"" Apr 22 18:38:35.485086 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.485054 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fmqhj"] Apr 22 18:38:35.567676 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.567639 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/38ed131f-ec9b-4691-91ca-1438655083f8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bkxxm\" (UID: \"38ed131f-ec9b-4691-91ca-1438655083f8\") " pod="openshift-insights/insights-runtime-extractor-bkxxm" Apr 22 18:38:35.567882 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.567682 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnhwq\" (UniqueName: \"kubernetes.io/projected/38ed131f-ec9b-4691-91ca-1438655083f8-kube-api-access-xnhwq\") pod \"insights-runtime-extractor-bkxxm\" (UID: \"38ed131f-ec9b-4691-91ca-1438655083f8\") " pod="openshift-insights/insights-runtime-extractor-bkxxm" Apr 22 18:38:35.567882 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.567724 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/38ed131f-ec9b-4691-91ca-1438655083f8-data-volume\") pod \"insights-runtime-extractor-bkxxm\" (UID: \"38ed131f-ec9b-4691-91ca-1438655083f8\") " pod="openshift-insights/insights-runtime-extractor-bkxxm" Apr 22 18:38:35.567882 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.567781 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/38ed131f-ec9b-4691-91ca-1438655083f8-crio-socket\") pod \"insights-runtime-extractor-bkxxm\" (UID: \"38ed131f-ec9b-4691-91ca-1438655083f8\") " pod="openshift-insights/insights-runtime-extractor-bkxxm" Apr 22 18:38:35.567882 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.567803 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/38ed131f-ec9b-4691-91ca-1438655083f8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bkxxm\" (UID: \"38ed131f-ec9b-4691-91ca-1438655083f8\") " pod="openshift-insights/insights-runtime-extractor-bkxxm" Apr 22 18:38:35.567882 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.567833 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/89e53d0a-d054-4c95-b501-048e1450ca72-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fmqhj\" (UID: \"89e53d0a-d054-4c95-b501-048e1450ca72\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fmqhj" Apr 22 18:38:35.567882 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.567862 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/38ed131f-ec9b-4691-91ca-1438655083f8-crio-socket\") pod \"insights-runtime-extractor-bkxxm\" (UID: \"38ed131f-ec9b-4691-91ca-1438655083f8\") 
" pod="openshift-insights/insights-runtime-extractor-bkxxm" Apr 22 18:38:35.567882 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.567879 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/89e53d0a-d054-4c95-b501-048e1450ca72-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fmqhj\" (UID: \"89e53d0a-d054-4c95-b501-048e1450ca72\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fmqhj" Apr 22 18:38:35.568165 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.568144 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/38ed131f-ec9b-4691-91ca-1438655083f8-data-volume\") pod \"insights-runtime-extractor-bkxxm\" (UID: \"38ed131f-ec9b-4691-91ca-1438655083f8\") " pod="openshift-insights/insights-runtime-extractor-bkxxm" Apr 22 18:38:35.568359 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.568324 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/38ed131f-ec9b-4691-91ca-1438655083f8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bkxxm\" (UID: \"38ed131f-ec9b-4691-91ca-1438655083f8\") " pod="openshift-insights/insights-runtime-extractor-bkxxm" Apr 22 18:38:35.569959 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.569944 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/38ed131f-ec9b-4691-91ca-1438655083f8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bkxxm\" (UID: \"38ed131f-ec9b-4691-91ca-1438655083f8\") " pod="openshift-insights/insights-runtime-extractor-bkxxm" Apr 22 18:38:35.579524 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.579497 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5b7b79bf9d-fsxj6"] Apr 22 
18:38:35.581412 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.581397 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.584868 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.584850 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 18:38:35.585582 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.585563 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-vvpgg\"" Apr 22 18:38:35.585671 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.585611 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 18:38:35.585738 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.585717 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 18:38:35.591225 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.591198 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnhwq\" (UniqueName: \"kubernetes.io/projected/38ed131f-ec9b-4691-91ca-1438655083f8-kube-api-access-xnhwq\") pod \"insights-runtime-extractor-bkxxm\" (UID: \"38ed131f-ec9b-4691-91ca-1438655083f8\") " pod="openshift-insights/insights-runtime-extractor-bkxxm" Apr 22 18:38:35.594310 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.594292 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 18:38:35.598199 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.598179 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5b7b79bf9d-fsxj6"] Apr 22 18:38:35.668956 ip-10-0-131-5 kubenswrapper[2572]: I0422 
18:38:35.668856 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/89e53d0a-d054-4c95-b501-048e1450ca72-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fmqhj\" (UID: \"89e53d0a-d054-4c95-b501-048e1450ca72\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fmqhj" Apr 22 18:38:35.668956 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.668904 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f9f437f-bb3e-48a8-9703-4e84916595f7-registry-tls\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.668956 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.668925 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0f9f437f-bb3e-48a8-9703-4e84916595f7-registry-certificates\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.669278 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.668984 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0f9f437f-bb3e-48a8-9703-4e84916595f7-ca-trust-extracted\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.669278 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.669007 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/0f9f437f-bb3e-48a8-9703-4e84916595f7-trusted-ca\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.669278 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.669032 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/89e53d0a-d054-4c95-b501-048e1450ca72-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fmqhj\" (UID: \"89e53d0a-d054-4c95-b501-048e1450ca72\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fmqhj" Apr 22 18:38:35.669278 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.669073 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0f9f437f-bb3e-48a8-9703-4e84916595f7-image-registry-private-configuration\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.669278 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.669120 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0f9f437f-bb3e-48a8-9703-4e84916595f7-installation-pull-secrets\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.669278 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.669144 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f9f437f-bb3e-48a8-9703-4e84916595f7-bound-sa-token\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: 
\"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.669278 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.669172 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jlwv\" (UniqueName: \"kubernetes.io/projected/0f9f437f-bb3e-48a8-9703-4e84916595f7-kube-api-access-7jlwv\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.669622 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.669600 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/89e53d0a-d054-4c95-b501-048e1450ca72-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fmqhj\" (UID: \"89e53d0a-d054-4c95-b501-048e1450ca72\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fmqhj" Apr 22 18:38:35.671134 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.671118 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/89e53d0a-d054-4c95-b501-048e1450ca72-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fmqhj\" (UID: \"89e53d0a-d054-4c95-b501-048e1450ca72\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fmqhj" Apr 22 18:38:35.685196 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.685177 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bkxxm" Apr 22 18:38:35.770901 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.770479 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0f9f437f-bb3e-48a8-9703-4e84916595f7-ca-trust-extracted\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.770901 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.770532 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f9f437f-bb3e-48a8-9703-4e84916595f7-trusted-ca\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.770901 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.770590 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0f9f437f-bb3e-48a8-9703-4e84916595f7-image-registry-private-configuration\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.770901 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.770635 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0f9f437f-bb3e-48a8-9703-4e84916595f7-installation-pull-secrets\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.770901 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.770662 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f9f437f-bb3e-48a8-9703-4e84916595f7-bound-sa-token\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.770901 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.770685 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jlwv\" (UniqueName: \"kubernetes.io/projected/0f9f437f-bb3e-48a8-9703-4e84916595f7-kube-api-access-7jlwv\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.770901 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.770754 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f9f437f-bb3e-48a8-9703-4e84916595f7-registry-tls\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.770901 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.770781 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0f9f437f-bb3e-48a8-9703-4e84916595f7-registry-certificates\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.771444 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.770998 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0f9f437f-bb3e-48a8-9703-4e84916595f7-ca-trust-extracted\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " 
pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.772115 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.771920 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f9f437f-bb3e-48a8-9703-4e84916595f7-trusted-ca\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.772239 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.772150 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0f9f437f-bb3e-48a8-9703-4e84916595f7-registry-certificates\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.773344 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.773304 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0f9f437f-bb3e-48a8-9703-4e84916595f7-installation-pull-secrets\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.773496 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.773478 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f9f437f-bb3e-48a8-9703-4e84916595f7-registry-tls\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.773561 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.773543 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/0f9f437f-bb3e-48a8-9703-4e84916595f7-image-registry-private-configuration\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.778373 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.778355 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f9f437f-bb3e-48a8-9703-4e84916595f7-bound-sa-token\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.778513 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.778497 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jlwv\" (UniqueName: \"kubernetes.io/projected/0f9f437f-bb3e-48a8-9703-4e84916595f7-kube-api-access-7jlwv\") pod \"image-registry-5b7b79bf9d-fsxj6\" (UID: \"0f9f437f-bb3e-48a8-9703-4e84916595f7\") " pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.781285 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.781267 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fmqhj" Apr 22 18:38:35.801195 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.801160 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bkxxm"] Apr 22 18:38:35.804213 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:38:35.804184 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38ed131f_ec9b_4691_91ca_1438655083f8.slice/crio-efd10a9a5aa3b1fac9eeeefd1100ce26cd1c0b10c011de3a1668fb48289b2f71 WatchSource:0}: Error finding container efd10a9a5aa3b1fac9eeeefd1100ce26cd1c0b10c011de3a1668fb48289b2f71: Status 404 returned error can't find the container with id efd10a9a5aa3b1fac9eeeefd1100ce26cd1c0b10c011de3a1668fb48289b2f71 Apr 22 18:38:35.900571 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.900537 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:35.903523 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:35.903492 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fmqhj"] Apr 22 18:38:35.907167 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:38:35.907137 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89e53d0a_d054_4c95_b501_048e1450ca72.slice/crio-49e8938d2da6baa93d5aada7e16c6a3fa8225c122e1c988b636450779f44e0ba WatchSource:0}: Error finding container 49e8938d2da6baa93d5aada7e16c6a3fa8225c122e1c988b636450779f44e0ba: Status 404 returned error can't find the container with id 49e8938d2da6baa93d5aada7e16c6a3fa8225c122e1c988b636450779f44e0ba Apr 22 18:38:36.027429 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:36.027399 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-5b7b79bf9d-fsxj6"] Apr 22 18:38:36.030342 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:38:36.030298 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f9f437f_bb3e_48a8_9703_4e84916595f7.slice/crio-e4fc31a91f77e4bcdb0993f73c231e1711cae209678d21e9a824369548ea8876 WatchSource:0}: Error finding container e4fc31a91f77e4bcdb0993f73c231e1711cae209678d21e9a824369548ea8876: Status 404 returned error can't find the container with id e4fc31a91f77e4bcdb0993f73c231e1711cae209678d21e9a824369548ea8876 Apr 22 18:38:36.218787 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:36.218708 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fmqhj" event={"ID":"89e53d0a-d054-4c95-b501-048e1450ca72","Type":"ContainerStarted","Data":"49e8938d2da6baa93d5aada7e16c6a3fa8225c122e1c988b636450779f44e0ba"} Apr 22 18:38:36.219866 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:36.219844 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bkxxm" event={"ID":"38ed131f-ec9b-4691-91ca-1438655083f8","Type":"ContainerStarted","Data":"dfae8e28011d0facf31ebaae9f95c8bb86f8c156410970bea74349a1260206d0"} Apr 22 18:38:36.219866 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:36.219869 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bkxxm" event={"ID":"38ed131f-ec9b-4691-91ca-1438655083f8","Type":"ContainerStarted","Data":"efd10a9a5aa3b1fac9eeeefd1100ce26cd1c0b10c011de3a1668fb48289b2f71"} Apr 22 18:38:36.221088 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:36.221070 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" 
event={"ID":"0f9f437f-bb3e-48a8-9703-4e84916595f7","Type":"ContainerStarted","Data":"6a7d6a0b2c3e97c6d807b54ca58f772a9a3a9ecaf2f9952425ecb1ed815fa8a4"} Apr 22 18:38:36.221173 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:36.221092 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" event={"ID":"0f9f437f-bb3e-48a8-9703-4e84916595f7","Type":"ContainerStarted","Data":"e4fc31a91f77e4bcdb0993f73c231e1711cae209678d21e9a824369548ea8876"} Apr 22 18:38:36.221268 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:36.221253 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:36.242049 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:36.241999 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" podStartSLOduration=1.241986068 podStartE2EDuration="1.241986068s" podCreationTimestamp="2026-04-22 18:38:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:38:36.241575812 +0000 UTC m=+151.100744533" watchObservedRunningTime="2026-04-22 18:38:36.241986068 +0000 UTC m=+151.101154786" Apr 22 18:38:36.476540 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:36.476450 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:36.477092 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:36.477072 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1567a865-78f8-433b-a4dd-e7478597180f-service-ca-bundle\") pod \"router-default-69c74fc656-2f57x\" (UID: \"1567a865-78f8-433b-a4dd-e7478597180f\") " pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:36.773012 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:36.772931 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-jjqxb\"" Apr 22 18:38:36.781126 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:36.781102 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:36.920100 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:36.920065 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-69c74fc656-2f57x"] Apr 22 18:38:36.982592 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:38:36.982560 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1567a865_78f8_433b_a4dd_e7478597180f.slice/crio-b762f3dc33f1b4af97b193ea76cd8301c9d6585911436293a4e865ae3e57ce41 WatchSource:0}: Error finding container b762f3dc33f1b4af97b193ea76cd8301c9d6585911436293a4e865ae3e57ce41: Status 404 returned error can't find the container with id b762f3dc33f1b4af97b193ea76cd8301c9d6585911436293a4e865ae3e57ce41 Apr 22 18:38:37.226400 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:37.226364 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-69c74fc656-2f57x" event={"ID":"1567a865-78f8-433b-a4dd-e7478597180f","Type":"ContainerStarted","Data":"b762f3dc33f1b4af97b193ea76cd8301c9d6585911436293a4e865ae3e57ce41"} Apr 22 18:38:38.229784 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:38.229744 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-69c74fc656-2f57x" 
event={"ID":"1567a865-78f8-433b-a4dd-e7478597180f","Type":"ContainerStarted","Data":"ec4666aae83108b806102835237d491aac02a8b6905383b9d7549667091e10f7"} Apr 22 18:38:38.230963 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:38.230941 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fmqhj" event={"ID":"89e53d0a-d054-4c95-b501-048e1450ca72","Type":"ContainerStarted","Data":"ca0ba176d131010a37b011f4f9f88f8750e98d22fb8bca8c2007fc4d33d1dbde"} Apr 22 18:38:38.232371 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:38.232345 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bkxxm" event={"ID":"38ed131f-ec9b-4691-91ca-1438655083f8","Type":"ContainerStarted","Data":"a05512b0f34a358a8a62d35ff738c99cba11c9c7207115a8dc2daada7a2ac44b"} Apr 22 18:38:38.250675 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:38.250627 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-69c74fc656-2f57x" podStartSLOduration=34.250611035 podStartE2EDuration="34.250611035s" podCreationTimestamp="2026-04-22 18:38:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:38:38.248898054 +0000 UTC m=+153.108066773" watchObservedRunningTime="2026-04-22 18:38:38.250611035 +0000 UTC m=+153.109779753" Apr 22 18:38:38.268205 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:38.268158 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fmqhj" podStartSLOduration=1.808833935 podStartE2EDuration="3.268142852s" podCreationTimestamp="2026-04-22 18:38:35 +0000 UTC" firstStartedPulling="2026-04-22 18:38:35.910004298 +0000 UTC m=+150.769172996" lastFinishedPulling="2026-04-22 18:38:37.369313215 +0000 UTC m=+152.228481913" observedRunningTime="2026-04-22 
18:38:38.267865422 +0000 UTC m=+153.127034137" watchObservedRunningTime="2026-04-22 18:38:38.268142852 +0000 UTC m=+153.127311570" Apr 22 18:38:38.781790 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:38.781754 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:38.784218 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:38.784197 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:39.237140 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:39.237103 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bkxxm" event={"ID":"38ed131f-ec9b-4691-91ca-1438655083f8","Type":"ContainerStarted","Data":"c0f1b62936f4745a9b85e8818f9b6f3d4d5c49e0e4477e5f900c42f7a30d8be1"} Apr 22 18:38:39.237633 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:39.237478 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:39.238688 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:39.238671 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-69c74fc656-2f57x" Apr 22 18:38:39.256256 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:39.256208 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-bkxxm" podStartSLOduration=1.212955633 podStartE2EDuration="4.256195515s" podCreationTimestamp="2026-04-22 18:38:35 +0000 UTC" firstStartedPulling="2026-04-22 18:38:35.878543428 +0000 UTC m=+150.737712125" lastFinishedPulling="2026-04-22 18:38:38.92178331 +0000 UTC m=+153.780952007" observedRunningTime="2026-04-22 18:38:39.255166682 +0000 UTC m=+154.114335428" watchObservedRunningTime="2026-04-22 18:38:39.256195515 +0000 UTC m=+154.115364233" Apr 22 18:38:39.780671 
ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:39.780632 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cmgvl"] Apr 22 18:38:39.783616 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:39.783602 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cmgvl" Apr 22 18:38:39.786028 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:39.785994 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 18:38:39.786390 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:39.786368 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-jpnlh\"" Apr 22 18:38:39.796062 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:39.796040 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cmgvl"] Apr 22 18:38:39.807417 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:39.807393 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/49a8da23-e075-4aff-a4a7-44232fb3d61f-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-cmgvl\" (UID: \"49a8da23-e075-4aff-a4a7-44232fb3d61f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cmgvl" Apr 22 18:38:39.908567 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:39.908526 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/49a8da23-e075-4aff-a4a7-44232fb3d61f-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-cmgvl\" (UID: \"49a8da23-e075-4aff-a4a7-44232fb3d61f\") " 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cmgvl" Apr 22 18:38:39.908746 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:39.908660 2572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 22 18:38:39.908746 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:39.908725 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49a8da23-e075-4aff-a4a7-44232fb3d61f-tls-certificates podName:49a8da23-e075-4aff-a4a7-44232fb3d61f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:40.408711069 +0000 UTC m=+155.267879770 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/49a8da23-e075-4aff-a4a7-44232fb3d61f-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-cmgvl" (UID: "49a8da23-e075-4aff-a4a7-44232fb3d61f") : secret "prometheus-operator-admission-webhook-tls" not found Apr 22 18:38:40.081958 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:40.081863 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-ntjbb" podUID="57975d7c-6756-4dde-9d27-faa3e96cc6f5" Apr 22 18:38:40.104262 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:40.104227 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-2phz6" podUID="f0ecf33d-061b-4ba1-9f1e-ec8f458b1027" Apr 22 18:38:40.239443 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:40.239408 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-ntjbb" Apr 22 18:38:40.239886 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:40.239518 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2phz6" Apr 22 18:38:40.412308 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:40.412270 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/49a8da23-e075-4aff-a4a7-44232fb3d61f-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-cmgvl\" (UID: \"49a8da23-e075-4aff-a4a7-44232fb3d61f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cmgvl" Apr 22 18:38:40.414583 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:40.414561 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/49a8da23-e075-4aff-a4a7-44232fb3d61f-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-cmgvl\" (UID: \"49a8da23-e075-4aff-a4a7-44232fb3d61f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cmgvl" Apr 22 18:38:40.691941 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:40.691857 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cmgvl" Apr 22 18:38:40.743439 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:40.743407 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-k7crw" podUID="fff77f0b-c2fb-4acb-b894-ce916d7cf9d2" Apr 22 18:38:40.805234 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:40.805199 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cmgvl"] Apr 22 18:38:40.808228 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:38:40.808200 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49a8da23_e075_4aff_a4a7_44232fb3d61f.slice/crio-02a7763d8165862aa1f7d04596917ad6a271f0e24486ea89a137ddec2912ecd6 WatchSource:0}: Error finding container 02a7763d8165862aa1f7d04596917ad6a271f0e24486ea89a137ddec2912ecd6: Status 404 returned error can't find the container with id 02a7763d8165862aa1f7d04596917ad6a271f0e24486ea89a137ddec2912ecd6 Apr 22 18:38:41.242940 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:41.242893 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cmgvl" event={"ID":"49a8da23-e075-4aff-a4a7-44232fb3d61f","Type":"ContainerStarted","Data":"02a7763d8165862aa1f7d04596917ad6a271f0e24486ea89a137ddec2912ecd6"} Apr 22 18:38:42.246810 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:42.246778 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cmgvl" event={"ID":"49a8da23-e075-4aff-a4a7-44232fb3d61f","Type":"ContainerStarted","Data":"47da22875ff3862379d3cda83290f9f8c3c85a025a80f9ebda148c969c4d5c65"} Apr 22 18:38:42.247210 
ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:42.246966 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cmgvl" Apr 22 18:38:42.251456 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:42.251434 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cmgvl" Apr 22 18:38:42.263927 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:42.263880 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cmgvl" podStartSLOduration=2.290387115 podStartE2EDuration="3.26386946s" podCreationTimestamp="2026-04-22 18:38:39 +0000 UTC" firstStartedPulling="2026-04-22 18:38:40.810048911 +0000 UTC m=+155.669217608" lastFinishedPulling="2026-04-22 18:38:41.783531255 +0000 UTC m=+156.642699953" observedRunningTime="2026-04-22 18:38:42.263264417 +0000 UTC m=+157.122433137" watchObservedRunningTime="2026-04-22 18:38:42.26386946 +0000 UTC m=+157.123038178" Apr 22 18:38:42.851011 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:42.850977 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-62ghg"] Apr 22 18:38:42.854220 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:42.854202 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-62ghg" Apr 22 18:38:42.857812 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:42.857790 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 18:38:42.858366 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:42.858340 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-2gfz7\"" Apr 22 18:38:42.858476 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:42.858384 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 18:38:42.858476 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:42.858384 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 18:38:42.858860 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:42.858842 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 18:38:42.859308 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:42.859290 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 18:38:42.865622 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:42.865597 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-62ghg"] Apr 22 18:38:42.934135 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:42.934097 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03fd38f5-5a27-45c2-8958-0dae07e467ee-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-62ghg\" (UID: 
\"03fd38f5-5a27-45c2-8958-0dae07e467ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-62ghg" Apr 22 18:38:42.934379 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:42.934159 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/03fd38f5-5a27-45c2-8958-0dae07e467ee-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-62ghg\" (UID: \"03fd38f5-5a27-45c2-8958-0dae07e467ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-62ghg" Apr 22 18:38:42.934379 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:42.934221 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03fd38f5-5a27-45c2-8958-0dae07e467ee-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-62ghg\" (UID: \"03fd38f5-5a27-45c2-8958-0dae07e467ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-62ghg" Apr 22 18:38:42.934379 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:42.934289 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljht7\" (UniqueName: \"kubernetes.io/projected/03fd38f5-5a27-45c2-8958-0dae07e467ee-kube-api-access-ljht7\") pod \"prometheus-operator-5676c8c784-62ghg\" (UID: \"03fd38f5-5a27-45c2-8958-0dae07e467ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-62ghg" Apr 22 18:38:43.035684 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:43.035648 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03fd38f5-5a27-45c2-8958-0dae07e467ee-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-62ghg\" (UID: \"03fd38f5-5a27-45c2-8958-0dae07e467ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-62ghg" Apr 22 
18:38:43.035810 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:43.035713 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/03fd38f5-5a27-45c2-8958-0dae07e467ee-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-62ghg\" (UID: \"03fd38f5-5a27-45c2-8958-0dae07e467ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-62ghg" Apr 22 18:38:43.035810 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:43.035735 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03fd38f5-5a27-45c2-8958-0dae07e467ee-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-62ghg\" (UID: \"03fd38f5-5a27-45c2-8958-0dae07e467ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-62ghg" Apr 22 18:38:43.035917 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:43.035833 2572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 22 18:38:43.035917 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:43.035860 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljht7\" (UniqueName: \"kubernetes.io/projected/03fd38f5-5a27-45c2-8958-0dae07e467ee-kube-api-access-ljht7\") pod \"prometheus-operator-5676c8c784-62ghg\" (UID: \"03fd38f5-5a27-45c2-8958-0dae07e467ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-62ghg" Apr 22 18:38:43.035917 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:43.035911 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03fd38f5-5a27-45c2-8958-0dae07e467ee-prometheus-operator-tls podName:03fd38f5-5a27-45c2-8958-0dae07e467ee nodeName:}" failed. No retries permitted until 2026-04-22 18:38:43.535887737 +0000 UTC m=+158.395056438 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/03fd38f5-5a27-45c2-8958-0dae07e467ee-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-62ghg" (UID: "03fd38f5-5a27-45c2-8958-0dae07e467ee") : secret "prometheus-operator-tls" not found Apr 22 18:38:43.036482 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:43.036463 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03fd38f5-5a27-45c2-8958-0dae07e467ee-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-62ghg\" (UID: \"03fd38f5-5a27-45c2-8958-0dae07e467ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-62ghg" Apr 22 18:38:43.038056 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:43.038033 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03fd38f5-5a27-45c2-8958-0dae07e467ee-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-62ghg\" (UID: \"03fd38f5-5a27-45c2-8958-0dae07e467ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-62ghg" Apr 22 18:38:43.043797 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:43.043775 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljht7\" (UniqueName: \"kubernetes.io/projected/03fd38f5-5a27-45c2-8958-0dae07e467ee-kube-api-access-ljht7\") pod \"prometheus-operator-5676c8c784-62ghg\" (UID: \"03fd38f5-5a27-45c2-8958-0dae07e467ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-62ghg" Apr 22 18:38:43.540773 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:43.540732 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/03fd38f5-5a27-45c2-8958-0dae07e467ee-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-62ghg\" (UID: 
\"03fd38f5-5a27-45c2-8958-0dae07e467ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-62ghg"
Apr 22 18:38:43.543127 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:43.543103 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/03fd38f5-5a27-45c2-8958-0dae07e467ee-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-62ghg\" (UID: \"03fd38f5-5a27-45c2-8958-0dae07e467ee\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-62ghg"
Apr 22 18:38:43.764276 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:43.764236 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-62ghg"
Apr 22 18:38:43.880059 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:43.880018 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-62ghg"]
Apr 22 18:38:43.882752 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:38:43.882723 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03fd38f5_5a27_45c2_8958_0dae07e467ee.slice/crio-0d8add7d600bdd879c831adab459040553e13fca879bdcaba14e443fbbe03e43 WatchSource:0}: Error finding container 0d8add7d600bdd879c831adab459040553e13fca879bdcaba14e443fbbe03e43: Status 404 returned error can't find the container with id 0d8add7d600bdd879c831adab459040553e13fca879bdcaba14e443fbbe03e43
Apr 22 18:38:44.252865 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:44.252825 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-62ghg" event={"ID":"03fd38f5-5a27-45c2-8958-0dae07e467ee","Type":"ContainerStarted","Data":"0d8add7d600bdd879c831adab459040553e13fca879bdcaba14e443fbbe03e43"}
Apr 22 18:38:45.053820 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:45.053785 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert\") pod \"ingress-canary-2phz6\" (UID: \"f0ecf33d-061b-4ba1-9f1e-ec8f458b1027\") " pod="openshift-ingress-canary/ingress-canary-2phz6"
Apr 22 18:38:45.054190 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:45.053845 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls\") pod \"dns-default-ntjbb\" (UID: \"57975d7c-6756-4dde-9d27-faa3e96cc6f5\") " pod="openshift-dns/dns-default-ntjbb"
Apr 22 18:38:45.056103 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:45.056078 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57975d7c-6756-4dde-9d27-faa3e96cc6f5-metrics-tls\") pod \"dns-default-ntjbb\" (UID: \"57975d7c-6756-4dde-9d27-faa3e96cc6f5\") " pod="openshift-dns/dns-default-ntjbb"
Apr 22 18:38:45.056223 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:45.056139 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0ecf33d-061b-4ba1-9f1e-ec8f458b1027-cert\") pod \"ingress-canary-2phz6\" (UID: \"f0ecf33d-061b-4ba1-9f1e-ec8f458b1027\") " pod="openshift-ingress-canary/ingress-canary-2phz6"
Apr 22 18:38:45.258181 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:45.258142 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-62ghg" event={"ID":"03fd38f5-5a27-45c2-8958-0dae07e467ee","Type":"ContainerStarted","Data":"81616ed93dfab11b14d27fe8e9860369fcd315b50049df0911abbb1cc9b09d96"}
Apr 22 18:38:45.258181 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:45.258187 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-62ghg" event={"ID":"03fd38f5-5a27-45c2-8958-0dae07e467ee","Type":"ContainerStarted","Data":"208cca8c22c82ba1f608fe3e67961b8a6801927e0a0305aacd480196c6101e88"}
Apr 22 18:38:45.279347 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:45.277635 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-62ghg" podStartSLOduration=2.055452364 podStartE2EDuration="3.277615884s" podCreationTimestamp="2026-04-22 18:38:42 +0000 UTC" firstStartedPulling="2026-04-22 18:38:43.884595532 +0000 UTC m=+158.743764229" lastFinishedPulling="2026-04-22 18:38:45.106759049 +0000 UTC m=+159.965927749" observedRunningTime="2026-04-22 18:38:45.275077151 +0000 UTC m=+160.134245870" watchObservedRunningTime="2026-04-22 18:38:45.277615884 +0000 UTC m=+160.136784606"
Apr 22 18:38:45.342922 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:45.342885 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9qw5s\""
Apr 22 18:38:45.342922 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:45.342897 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4wrcz\""
Apr 22 18:38:45.350887 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:45.350852 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ntjbb"
Apr 22 18:38:45.351031 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:45.350983 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2phz6"
Apr 22 18:38:45.473568 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:45.473535 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ntjbb"]
Apr 22 18:38:45.477883 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:38:45.477845 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57975d7c_6756_4dde_9d27_faa3e96cc6f5.slice/crio-e2efec3b38c7adbad881c86fedc10b648318298053000996a1cdc19e79ca49c5 WatchSource:0}: Error finding container e2efec3b38c7adbad881c86fedc10b648318298053000996a1cdc19e79ca49c5: Status 404 returned error can't find the container with id e2efec3b38c7adbad881c86fedc10b648318298053000996a1cdc19e79ca49c5
Apr 22 18:38:45.491284 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:45.491259 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2phz6"]
Apr 22 18:38:45.493650 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:38:45.493625 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0ecf33d_061b_4ba1_9f1e_ec8f458b1027.slice/crio-e133bd605ac5f82caf5dd0ab553910d3763979cba9c316033ded2374b93465ce WatchSource:0}: Error finding container e133bd605ac5f82caf5dd0ab553910d3763979cba9c316033ded2374b93465ce: Status 404 returned error can't find the container with id e133bd605ac5f82caf5dd0ab553910d3763979cba9c316033ded2374b93465ce
Apr 22 18:38:46.263147 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:46.263095 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ntjbb" event={"ID":"57975d7c-6756-4dde-9d27-faa3e96cc6f5","Type":"ContainerStarted","Data":"e2efec3b38c7adbad881c86fedc10b648318298053000996a1cdc19e79ca49c5"}
Apr 22 18:38:46.265037 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:46.265007 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2phz6" event={"ID":"f0ecf33d-061b-4ba1-9f1e-ec8f458b1027","Type":"ContainerStarted","Data":"e133bd605ac5f82caf5dd0ab553910d3763979cba9c316033ded2374b93465ce"}
Apr 22 18:38:47.276866 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.276833 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jt2mg"]
Apr 22 18:38:47.280253 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.280232 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.283442 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.283354 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 18:38:47.283608 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.283578 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-bc9vl\""
Apr 22 18:38:47.284222 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.284204 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 18:38:47.284485 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.284472 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 18:38:47.377567 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.376888 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-sys\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.377567 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.377001 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-root\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.377567 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.377034 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-node-exporter-wtmp\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.377567 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.377091 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-metrics-client-ca\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.377567 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.377252 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-node-exporter-tls\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.377567 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.377356 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.377567 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.377414 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-node-exporter-textfile\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.377567 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.377439 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvnkg\" (UniqueName: \"kubernetes.io/projected/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-kube-api-access-wvnkg\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.377567 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.377507 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-node-exporter-accelerators-collector-config\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.484392 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.477873 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-node-exporter-accelerators-collector-config\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.484392 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.477919 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-sys\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.484392 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.477948 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-root\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.484392 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.477974 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-node-exporter-wtmp\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.484392 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.478000 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-metrics-client-ca\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.484392 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.478056 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-node-exporter-tls\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.484392 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.478087 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.484392 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.478126 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-node-exporter-textfile\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.484392 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.478155 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvnkg\" (UniqueName: \"kubernetes.io/projected/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-kube-api-access-wvnkg\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.484392 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.479011 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-node-exporter-accelerators-collector-config\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.484392 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:47.479132 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 18:38:47.484392 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:38:47.479179 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-node-exporter-tls podName:b5a914ec-c1b6-4b17-a510-d0ca4c4348f3 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:47.979161806 +0000 UTC m=+162.838330518 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-node-exporter-tls") pod "node-exporter-jt2mg" (UID: "b5a914ec-c1b6-4b17-a510-d0ca4c4348f3") : secret "node-exporter-tls" not found
Apr 22 18:38:47.484392 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.479247 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-metrics-client-ca\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.484392 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.479322 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-sys\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.484392 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.479446 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-root\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.484392 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.479573 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-node-exporter-wtmp\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.484392 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.479861 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-node-exporter-textfile\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.485497 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.482762 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.489812 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.489774 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvnkg\" (UniqueName: \"kubernetes.io/projected/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-kube-api-access-wvnkg\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.983074 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.982978 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-node-exporter-tls\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:47.985254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:47.985224 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b5a914ec-c1b6-4b17-a510-d0ca4c4348f3-node-exporter-tls\") pod \"node-exporter-jt2mg\" (UID: \"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3\") " pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:48.191745 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:48.191704 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jt2mg"
Apr 22 18:38:48.199898 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:38:48.199863 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5a914ec_c1b6_4b17_a510_d0ca4c4348f3.slice/crio-7ba8fbbd1c3bb0e958ac394d4e2893694fd7cb6054530c55701e7932501b8a6c WatchSource:0}: Error finding container 7ba8fbbd1c3bb0e958ac394d4e2893694fd7cb6054530c55701e7932501b8a6c: Status 404 returned error can't find the container with id 7ba8fbbd1c3bb0e958ac394d4e2893694fd7cb6054530c55701e7932501b8a6c
Apr 22 18:38:48.272589 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:48.272498 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jt2mg" event={"ID":"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3","Type":"ContainerStarted","Data":"7ba8fbbd1c3bb0e958ac394d4e2893694fd7cb6054530c55701e7932501b8a6c"}
Apr 22 18:38:48.274181 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:48.274149 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ntjbb" event={"ID":"57975d7c-6756-4dde-9d27-faa3e96cc6f5","Type":"ContainerStarted","Data":"2a10f572dfa5c90a96276e6cf7fbae9d49a975c05c9837238a30f98bb5d19194"}
Apr 22 18:38:48.274181 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:48.274181 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ntjbb" event={"ID":"57975d7c-6756-4dde-9d27-faa3e96cc6f5","Type":"ContainerStarted","Data":"d23d79645d056bd9e07342e6d22ce98a9da3e97cc2aa4768fe95a9edd034a760"}
Apr 22 18:38:48.274376 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:48.274315 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-ntjbb"
Apr 22 18:38:48.275419 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:48.275399 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2phz6" event={"ID":"f0ecf33d-061b-4ba1-9f1e-ec8f458b1027","Type":"ContainerStarted","Data":"146598a20bdd0cc9e80bef163808f6ae44d14a5364b030f13e744af0c1389651"}
Apr 22 18:38:48.295101 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:48.295058 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ntjbb" podStartSLOduration=129.465366129 podStartE2EDuration="2m11.295044288s" podCreationTimestamp="2026-04-22 18:36:37 +0000 UTC" firstStartedPulling="2026-04-22 18:38:45.479670392 +0000 UTC m=+160.338839093" lastFinishedPulling="2026-04-22 18:38:47.309348542 +0000 UTC m=+162.168517252" observedRunningTime="2026-04-22 18:38:48.294399969 +0000 UTC m=+163.153568688" watchObservedRunningTime="2026-04-22 18:38:48.295044288 +0000 UTC m=+163.154213006"
Apr 22 18:38:48.310737 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:48.310686 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2phz6" podStartSLOduration=129.4904203 podStartE2EDuration="2m11.310671264s" podCreationTimestamp="2026-04-22 18:36:37 +0000 UTC" firstStartedPulling="2026-04-22 18:38:45.495535407 +0000 UTC m=+160.354704104" lastFinishedPulling="2026-04-22 18:38:47.315786368 +0000 UTC m=+162.174955068" observedRunningTime="2026-04-22 18:38:48.30980588 +0000 UTC m=+163.168974596" watchObservedRunningTime="2026-04-22 18:38:48.310671264 +0000 UTC m=+163.169839985"
Apr 22 18:38:49.230780 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.230747 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"]
Apr 22 18:38:49.234405 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.234378 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.237049 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.237026 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 22 18:38:49.237289 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.237270 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-c2j8ujusl47t2\""
Apr 22 18:38:49.237428 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.237270 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 22 18:38:49.237650 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.237497 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-9sfzf\""
Apr 22 18:38:49.237650 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.237554 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 22 18:38:49.237650 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.237587 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 22 18:38:49.238164 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.238145 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 22 18:38:49.244622 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.244598 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"]
Apr 22 18:38:49.279458 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.279427 2572 generic.go:358] "Generic (PLEG): container finished" podID="b5a914ec-c1b6-4b17-a510-d0ca4c4348f3" containerID="a71f81a1d3e902c51ca7186d2618e4fe5a13054bdf3efb855377b99d3ac9b70c" exitCode=0
Apr 22 18:38:49.279595 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.279514 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jt2mg" event={"ID":"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3","Type":"ContainerDied","Data":"a71f81a1d3e902c51ca7186d2618e4fe5a13054bdf3efb855377b99d3ac9b70c"}
Apr 22 18:38:49.296694 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.296667 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.297112 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.296720 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-metrics-client-ca\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.297112 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.296747 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-secret-grpc-tls\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.297112 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.296811 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.297112 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.296841 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5n98\" (UniqueName: \"kubernetes.io/projected/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-kube-api-access-t5n98\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.297112 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.296875 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.297112 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.296935 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.297112 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.297004 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-secret-thanos-querier-tls\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.397787 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.397763 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-secret-thanos-querier-tls\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.397951 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.397826 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.397951 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.397901 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-metrics-client-ca\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.397951 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.397930 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-secret-grpc-tls\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.398101 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.397996 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.398101 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.398023 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5n98\" (UniqueName: \"kubernetes.io/projected/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-kube-api-access-t5n98\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.398101 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.398067 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.398239 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.398215 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.399543 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.399510 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-metrics-client-ca\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.400773 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.400690 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.400901 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.400790 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-secret-thanos-querier-tls\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.401226 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.401203 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.401316 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.401271 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.401552 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.401532 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.401751 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.401730 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-secret-grpc-tls\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.407461 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.407442 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5n98\" (UniqueName: \"kubernetes.io/projected/a7d8f602-2ff4-4ef2-9216-162f4e272e2f-kube-api-access-t5n98\") pod \"thanos-querier-5fcff6d6bc-744gl\" (UID: \"a7d8f602-2ff4-4ef2-9216-162f4e272e2f\") " pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.543743 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.543704 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
Apr 22 18:38:49.666047 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.666014 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"]
Apr 22 18:38:49.669357 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:38:49.669309 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7d8f602_2ff4_4ef2_9216_162f4e272e2f.slice/crio-2172260d4c24fa298dd88488a05de950af0909ed9d8ac4bf57d4e3459a3b2379 WatchSource:0}: Error finding container 2172260d4c24fa298dd88488a05de950af0909ed9d8ac4bf57d4e3459a3b2379: Status 404 returned error can't find the container with id 2172260d4c24fa298dd88488a05de950af0909ed9d8ac4bf57d4e3459a3b2379
Apr 22 18:38:49.731181 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:49.731143 2572 scope.go:117] "RemoveContainer" containerID="1a8323db99c83e46ac477b9990893368a423d7bf566ae446908fba858e92c658"
Apr 22 18:38:50.284871 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:50.284837 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jt2mg" event={"ID":"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3","Type":"ContainerStarted","Data":"8bae5974fba578c75e412996f3b2a7b11d7fd93fbea7395654f2b8b5f769d83e"}
Apr 22 18:38:50.284871 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:50.284878 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jt2mg" event={"ID":"b5a914ec-c1b6-4b17-a510-d0ca4c4348f3","Type":"ContainerStarted","Data":"7dae4c378bfd62d2489d2784e13f376f1827008acc44a76801e8f5f810fc512a"}
Apr 22 18:38:50.285939 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:50.285914 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl"
event={"ID":"a7d8f602-2ff4-4ef2-9216-162f4e272e2f","Type":"ContainerStarted","Data":"2172260d4c24fa298dd88488a05de950af0909ed9d8ac4bf57d4e3459a3b2379"} Apr 22 18:38:50.287545 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:50.287528 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 18:38:50.287638 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:50.287608 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" event={"ID":"60e694f5-420a-4a93-b793-b951e02e4c81","Type":"ContainerStarted","Data":"32484a6ebea0e4aa6ddf81145aff117467debd7880380ce9abab9fa0961f04b8"} Apr 22 18:38:50.287867 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:50.287850 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" Apr 22 18:38:50.292834 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:50.292813 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" Apr 22 18:38:50.309678 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:50.309486 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jt2mg" podStartSLOduration=2.4280063419999998 podStartE2EDuration="3.309468903s" podCreationTimestamp="2026-04-22 18:38:47 +0000 UTC" firstStartedPulling="2026-04-22 18:38:48.201531836 +0000 UTC m=+163.060700534" lastFinishedPulling="2026-04-22 18:38:49.082994396 +0000 UTC m=+163.942163095" observedRunningTime="2026-04-22 18:38:50.307788612 +0000 UTC m=+165.166957327" watchObservedRunningTime="2026-04-22 18:38:50.309468903 +0000 UTC m=+165.168637624" Apr 22 18:38:50.331498 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:50.331432 2572 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-6brbj" podStartSLOduration=43.23912265 podStartE2EDuration="46.33141084s" podCreationTimestamp="2026-04-22 18:38:04 +0000 UTC" firstStartedPulling="2026-04-22 18:38:05.003378192 +0000 UTC m=+119.862546890" lastFinishedPulling="2026-04-22 18:38:08.095666379 +0000 UTC m=+122.954835080" observedRunningTime="2026-04-22 18:38:50.329536534 +0000 UTC m=+165.188705255" watchObservedRunningTime="2026-04-22 18:38:50.33141084 +0000 UTC m=+165.190579560" Apr 22 18:38:52.296267 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:52.296233 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl" event={"ID":"a7d8f602-2ff4-4ef2-9216-162f4e272e2f","Type":"ContainerStarted","Data":"951d4dd58f7e45b451a941ab37cd9bcf7d5c1e457c5f6920b88e71e25000ad0f"} Apr 22 18:38:52.296267 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:52.296269 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl" event={"ID":"a7d8f602-2ff4-4ef2-9216-162f4e272e2f","Type":"ContainerStarted","Data":"5d2f786ac137d53c5aaf35903f10e0af191f9b102bcf7751f9490ced78daaf49"} Apr 22 18:38:52.296811 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:52.296279 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl" event={"ID":"a7d8f602-2ff4-4ef2-9216-162f4e272e2f","Type":"ContainerStarted","Data":"7a8021f80607b0ddaf513ff63c9315285d4c9110e464305a2708505a934bf442"} Apr 22 18:38:52.731028 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:52.730999 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:38:53.302739 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:53.302700 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl" event={"ID":"a7d8f602-2ff4-4ef2-9216-162f4e272e2f","Type":"ContainerStarted","Data":"c60cb0b7e650af478caee0f548dcb9802c3606c9cdd16194c9db1da6d24cdd64"} Apr 22 18:38:53.302739 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:53.302744 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl" event={"ID":"a7d8f602-2ff4-4ef2-9216-162f4e272e2f","Type":"ContainerStarted","Data":"e91da594657b55047fa981bc1c495f17643bc9c715c92c9590e3b75fb3cda263"} Apr 22 18:38:53.303230 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:53.302757 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl" event={"ID":"a7d8f602-2ff4-4ef2-9216-162f4e272e2f","Type":"ContainerStarted","Data":"f6c6d991e1f8e0cb3d7afabbd42db3757c630d04fe34af55b0428fac59e6ec87"} Apr 22 18:38:53.303230 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:53.302864 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl" Apr 22 18:38:53.329185 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:53.329131 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl" podStartSLOduration=1.418648112 podStartE2EDuration="4.329117292s" podCreationTimestamp="2026-04-22 18:38:49 +0000 UTC" firstStartedPulling="2026-04-22 18:38:49.671092571 +0000 UTC m=+164.530261267" lastFinishedPulling="2026-04-22 18:38:52.581561749 +0000 UTC m=+167.440730447" observedRunningTime="2026-04-22 18:38:53.32790364 +0000 UTC m=+168.187072360" watchObservedRunningTime="2026-04-22 18:38:53.329117292 +0000 UTC m=+168.188286034" Apr 22 
18:38:55.904527 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:55.904428 2572 patch_prober.go:28] interesting pod/image-registry-5b7b79bf9d-fsxj6 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:38:55.904527 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:55.904501 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" podUID="0f9f437f-bb3e-48a8-9703-4e84916595f7" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:38:57.230223 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:57.230196 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5b7b79bf9d-fsxj6" Apr 22 18:38:58.282530 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:58.282499 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ntjbb" Apr 22 18:38:59.311972 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:38:59.311944 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5fcff6d6bc-744gl" Apr 22 18:39:01.190204 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.190170 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-689dbc67d5-g7jth"] Apr 22 18:39:01.193515 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.193499 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:01.196931 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.196907 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 18:39:01.197037 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.196967 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 18:39:01.197175 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.197157 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 18:39:01.197300 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.197206 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-hxtps\"" Apr 22 18:39:01.197300 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.197243 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 18:39:01.197430 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.197294 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 18:39:01.197430 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.197403 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 18:39:01.197597 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.197583 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 18:39:01.202300 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.202277 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-689dbc67d5-g7jth"] Apr 22 18:39:01.293076 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.293049 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/86a74cf2-0185-4f94-9afa-e20d254cdb62-console-config\") pod \"console-689dbc67d5-g7jth\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:01.293076 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.293077 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spj25\" (UniqueName: \"kubernetes.io/projected/86a74cf2-0185-4f94-9afa-e20d254cdb62-kube-api-access-spj25\") pod \"console-689dbc67d5-g7jth\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:01.293254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.293099 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/86a74cf2-0185-4f94-9afa-e20d254cdb62-console-serving-cert\") pod \"console-689dbc67d5-g7jth\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:01.293254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.293126 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86a74cf2-0185-4f94-9afa-e20d254cdb62-service-ca\") pod \"console-689dbc67d5-g7jth\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:01.293254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.293218 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/86a74cf2-0185-4f94-9afa-e20d254cdb62-console-oauth-config\") pod \"console-689dbc67d5-g7jth\" (UID: 
\"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:01.293254 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.293245 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/86a74cf2-0185-4f94-9afa-e20d254cdb62-oauth-serving-cert\") pod \"console-689dbc67d5-g7jth\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:01.394206 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.394170 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/86a74cf2-0185-4f94-9afa-e20d254cdb62-oauth-serving-cert\") pod \"console-689dbc67d5-g7jth\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:01.394418 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.394226 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/86a74cf2-0185-4f94-9afa-e20d254cdb62-console-config\") pod \"console-689dbc67d5-g7jth\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:01.394418 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.394255 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spj25\" (UniqueName: \"kubernetes.io/projected/86a74cf2-0185-4f94-9afa-e20d254cdb62-kube-api-access-spj25\") pod \"console-689dbc67d5-g7jth\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:01.394418 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.394281 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/86a74cf2-0185-4f94-9afa-e20d254cdb62-console-serving-cert\") pod \"console-689dbc67d5-g7jth\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:01.394418 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.394319 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86a74cf2-0185-4f94-9afa-e20d254cdb62-service-ca\") pod \"console-689dbc67d5-g7jth\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:01.394418 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.394415 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/86a74cf2-0185-4f94-9afa-e20d254cdb62-console-oauth-config\") pod \"console-689dbc67d5-g7jth\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:01.394953 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.394929 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/86a74cf2-0185-4f94-9afa-e20d254cdb62-oauth-serving-cert\") pod \"console-689dbc67d5-g7jth\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:01.395042 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.394968 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86a74cf2-0185-4f94-9afa-e20d254cdb62-service-ca\") pod \"console-689dbc67d5-g7jth\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:01.395042 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.394990 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/86a74cf2-0185-4f94-9afa-e20d254cdb62-console-config\") pod \"console-689dbc67d5-g7jth\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:01.396696 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.396671 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/86a74cf2-0185-4f94-9afa-e20d254cdb62-console-oauth-config\") pod \"console-689dbc67d5-g7jth\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:01.396771 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.396684 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/86a74cf2-0185-4f94-9afa-e20d254cdb62-console-serving-cert\") pod \"console-689dbc67d5-g7jth\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:01.403502 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.403482 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spj25\" (UniqueName: \"kubernetes.io/projected/86a74cf2-0185-4f94-9afa-e20d254cdb62-kube-api-access-spj25\") pod \"console-689dbc67d5-g7jth\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:01.503536 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.503488 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:01.635512 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:01.635488 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-689dbc67d5-g7jth"] Apr 22 18:39:01.637776 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:39:01.637747 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86a74cf2_0185_4f94_9afa_e20d254cdb62.slice/crio-010ddd0601d3d3092c2bba0ee31dfd8ae9fc085b4e6ffefcafa996febee728c0 WatchSource:0}: Error finding container 010ddd0601d3d3092c2bba0ee31dfd8ae9fc085b4e6ffefcafa996febee728c0: Status 404 returned error can't find the container with id 010ddd0601d3d3092c2bba0ee31dfd8ae9fc085b4e6ffefcafa996febee728c0 Apr 22 18:39:02.326870 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:02.326835 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-689dbc67d5-g7jth" event={"ID":"86a74cf2-0185-4f94-9afa-e20d254cdb62","Type":"ContainerStarted","Data":"010ddd0601d3d3092c2bba0ee31dfd8ae9fc085b4e6ffefcafa996febee728c0"} Apr 22 18:39:05.336859 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:05.336819 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-689dbc67d5-g7jth" event={"ID":"86a74cf2-0185-4f94-9afa-e20d254cdb62","Type":"ContainerStarted","Data":"71b87a330a231dfdfb76f479f80de91214700856eb20a6fb540c7ecc54d879b6"} Apr 22 18:39:05.355535 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:05.355490 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-689dbc67d5-g7jth" podStartSLOduration=1.400416036 podStartE2EDuration="4.355476147s" podCreationTimestamp="2026-04-22 18:39:01 +0000 UTC" firstStartedPulling="2026-04-22 18:39:01.639720613 +0000 UTC m=+176.498889310" lastFinishedPulling="2026-04-22 18:39:04.594780721 +0000 UTC m=+179.453949421" observedRunningTime="2026-04-22 
18:39:05.354836398 +0000 UTC m=+180.214005115" watchObservedRunningTime="2026-04-22 18:39:05.355476147 +0000 UTC m=+180.214644865" Apr 22 18:39:11.503792 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:11.503755 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:11.503792 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:11.503799 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:11.508849 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:11.508827 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:12.167313 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.167282 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5988d77944-k95p8"] Apr 22 18:39:12.172458 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.172433 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.180549 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.180508 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 18:39:12.180772 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.180747 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5988d77944-k95p8"] Apr 22 18:39:12.285191 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.285168 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-service-ca\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.285316 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.285204 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkhwz\" (UniqueName: \"kubernetes.io/projected/5017fed1-bd62-4980-bb06-9a5a408f5cac-kube-api-access-dkhwz\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.285316 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.285223 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-console-config\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.285316 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.285241 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-oauth-serving-cert\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.285316 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.285258 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5017fed1-bd62-4980-bb06-9a5a408f5cac-console-serving-cert\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.285316 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.285278 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-trusted-ca-bundle\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.285316 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.285296 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5017fed1-bd62-4980-bb06-9a5a408f5cac-console-oauth-config\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.363362 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.363320 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:12.386137 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.386107 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/5017fed1-bd62-4980-bb06-9a5a408f5cac-console-oauth-config\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.386240 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.386171 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-service-ca\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.386240 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.386197 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkhwz\" (UniqueName: \"kubernetes.io/projected/5017fed1-bd62-4980-bb06-9a5a408f5cac-kube-api-access-dkhwz\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.386240 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.386214 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-console-config\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.386240 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.386234 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-oauth-serving-cert\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.386449 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.386256 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5017fed1-bd62-4980-bb06-9a5a408f5cac-console-serving-cert\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.386449 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.386286 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-trusted-ca-bundle\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.386841 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.386816 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-service-ca\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.386971 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.386945 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-oauth-serving-cert\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.387063 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.387036 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-console-config\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.387200 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.387179 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-trusted-ca-bundle\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.388623 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.388604 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5017fed1-bd62-4980-bb06-9a5a408f5cac-console-oauth-config\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.388698 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.388683 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5017fed1-bd62-4980-bb06-9a5a408f5cac-console-serving-cert\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.406696 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.406667 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkhwz\" (UniqueName: \"kubernetes.io/projected/5017fed1-bd62-4980-bb06-9a5a408f5cac-kube-api-access-dkhwz\") pod \"console-5988d77944-k95p8\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.482766 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.482695 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:12.596152 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:12.596129 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5988d77944-k95p8"] Apr 22 18:39:12.598237 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:39:12.598212 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5017fed1_bd62_4980_bb06_9a5a408f5cac.slice/crio-c7962f8ac4396f263122b3978ecf3c38f9a72d7825328fefc247c3ad6f4002eb WatchSource:0}: Error finding container c7962f8ac4396f263122b3978ecf3c38f9a72d7825328fefc247c3ad6f4002eb: Status 404 returned error can't find the container with id c7962f8ac4396f263122b3978ecf3c38f9a72d7825328fefc247c3ad6f4002eb Apr 22 18:39:13.362234 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:13.362205 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5988d77944-k95p8" event={"ID":"5017fed1-bd62-4980-bb06-9a5a408f5cac","Type":"ContainerStarted","Data":"0115818a3d8034279c06b776d4915bb0d86642dfe44ecc7527cff381291e77fc"} Apr 22 18:39:13.362234 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:13.362236 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5988d77944-k95p8" event={"ID":"5017fed1-bd62-4980-bb06-9a5a408f5cac","Type":"ContainerStarted","Data":"c7962f8ac4396f263122b3978ecf3c38f9a72d7825328fefc247c3ad6f4002eb"} Apr 22 18:39:13.380802 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:13.380759 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5988d77944-k95p8" podStartSLOduration=1.380742645 podStartE2EDuration="1.380742645s" podCreationTimestamp="2026-04-22 18:39:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:39:13.380743376 +0000 UTC m=+188.239912098" 
watchObservedRunningTime="2026-04-22 18:39:13.380742645 +0000 UTC m=+188.239911364" Apr 22 18:39:22.483376 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:22.483345 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:22.483376 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:22.483385 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:22.487734 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:22.487710 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:23.392938 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:23.392908 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:39:23.444859 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:23.444828 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-689dbc67d5-g7jth"] Apr 22 18:39:24.393212 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:24.393172 2572 generic.go:358] "Generic (PLEG): container finished" podID="89183697-99ab-489f-95ef-9654164feac8" containerID="b195bd4cac6e970ce477b13bddcb794935fb3557288608b2b3041cba3573e04b" exitCode=0 Apr 22 18:39:24.393691 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:24.393243 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k7ljr" event={"ID":"89183697-99ab-489f-95ef-9654164feac8","Type":"ContainerDied","Data":"b195bd4cac6e970ce477b13bddcb794935fb3557288608b2b3041cba3573e04b"} Apr 22 18:39:24.393691 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:24.393679 2572 scope.go:117] "RemoveContainer" containerID="b195bd4cac6e970ce477b13bddcb794935fb3557288608b2b3041cba3573e04b" Apr 22 18:39:25.397382 ip-10-0-131-5 
kubenswrapper[2572]: I0422 18:39:25.397349 2572 generic.go:358] "Generic (PLEG): container finished" podID="e949ed90-0ea2-43e9-8cbc-ae1bec9390c9" containerID="3dbdc280d87418e46bf1d5e462c5aaa2a406e5ba640ee02d8abe00a7b159b402" exitCode=0 Apr 22 18:39:25.397797 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:25.397358 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gs4mb" event={"ID":"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9","Type":"ContainerDied","Data":"3dbdc280d87418e46bf1d5e462c5aaa2a406e5ba640ee02d8abe00a7b159b402"} Apr 22 18:39:25.397797 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:25.397762 2572 scope.go:117] "RemoveContainer" containerID="3dbdc280d87418e46bf1d5e462c5aaa2a406e5ba640ee02d8abe00a7b159b402" Apr 22 18:39:25.399001 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:25.398978 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k7ljr" event={"ID":"89183697-99ab-489f-95ef-9654164feac8","Type":"ContainerStarted","Data":"fc3c5c7c85dd5be79aa6a04ed50f6247f53649127b4a1e8b90dd955d196259df"} Apr 22 18:39:26.403740 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:26.403700 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gs4mb" event={"ID":"e949ed90-0ea2-43e9-8cbc-ae1bec9390c9","Type":"ContainerStarted","Data":"a9f9e9f6dde5314d9b4c353b7b8ea4dabbdc988c68adc35a42f1f72830003d28"} Apr 22 18:39:30.640508 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:30.640478 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jt2mg_b5a914ec-c1b6-4b17-a510-d0ca4c4348f3/init-textfile/0.log" Apr 22 18:39:30.840841 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:30.840814 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-jt2mg_b5a914ec-c1b6-4b17-a510-d0ca4c4348f3/node-exporter/0.log" Apr 22 18:39:31.041050 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:31.040983 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jt2mg_b5a914ec-c1b6-4b17-a510-d0ca4c4348f3/kube-rbac-proxy/0.log" Apr 22 18:39:33.841836 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:33.841801 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-62ghg_03fd38f5-5a27-45c2-8958-0dae07e467ee/prometheus-operator/0.log" Apr 22 18:39:34.040574 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:34.040545 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-62ghg_03fd38f5-5a27-45c2-8958-0dae07e467ee/kube-rbac-proxy/0.log" Apr 22 18:39:34.240350 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:34.240255 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-cmgvl_49a8da23-e075-4aff-a4a7-44232fb3d61f/prometheus-operator-admission-webhook/0.log" Apr 22 18:39:34.440753 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:34.440719 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fcff6d6bc-744gl_a7d8f602-2ff4-4ef2-9216-162f4e272e2f/thanos-query/0.log" Apr 22 18:39:34.640380 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:34.640353 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fcff6d6bc-744gl_a7d8f602-2ff4-4ef2-9216-162f4e272e2f/kube-rbac-proxy-web/0.log" Apr 22 18:39:34.839287 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:34.839262 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fcff6d6bc-744gl_a7d8f602-2ff4-4ef2-9216-162f4e272e2f/kube-rbac-proxy/0.log" Apr 22 18:39:35.040246 
ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:35.040166 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fcff6d6bc-744gl_a7d8f602-2ff4-4ef2-9216-162f4e272e2f/prom-label-proxy/0.log" Apr 22 18:39:35.240723 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:35.240696 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fcff6d6bc-744gl_a7d8f602-2ff4-4ef2-9216-162f4e272e2f/kube-rbac-proxy-rules/0.log" Apr 22 18:39:35.441182 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:35.441156 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fcff6d6bc-744gl_a7d8f602-2ff4-4ef2-9216-162f4e272e2f/kube-rbac-proxy-metrics/0.log" Apr 22 18:39:35.640538 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:35.640505 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-fmqhj_89e53d0a-d054-4c95-b501-048e1450ca72/networking-console-plugin/0.log" Apr 22 18:39:35.840703 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:35.840679 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 18:39:36.041905 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:36.041872 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/3.log" Apr 22 18:39:36.240425 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:36.240349 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5988d77944-k95p8_5017fed1-bd62-4980-bb06-9a5a408f5cac/console/0.log" Apr 22 18:39:36.440822 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:36.440795 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-689dbc67d5-g7jth_86a74cf2-0185-4f94-9afa-e20d254cdb62/console/0.log" Apr 22 18:39:36.841300 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:36.841270 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-69c74fc656-2f57x_1567a865-78f8-433b-a4dd-e7478597180f/router/0.log" Apr 22 18:39:37.041020 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:37.040989 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2phz6_f0ecf33d-061b-4ba1-9f1e-ec8f458b1027/serve-healthcheck-canary/0.log" Apr 22 18:39:48.464378 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:48.464316 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-689dbc67d5-g7jth" podUID="86a74cf2-0185-4f94-9afa-e20d254cdb62" containerName="console" containerID="cri-o://71b87a330a231dfdfb76f479f80de91214700856eb20a6fb540c7ecc54d879b6" gracePeriod=15 Apr 22 18:39:48.694010 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:48.693981 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-689dbc67d5-g7jth_86a74cf2-0185-4f94-9afa-e20d254cdb62/console/0.log" Apr 22 18:39:48.694127 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:48.694050 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:48.775165 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:48.775076 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/86a74cf2-0185-4f94-9afa-e20d254cdb62-console-config\") pod \"86a74cf2-0185-4f94-9afa-e20d254cdb62\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " Apr 22 18:39:48.775165 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:48.775136 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spj25\" (UniqueName: \"kubernetes.io/projected/86a74cf2-0185-4f94-9afa-e20d254cdb62-kube-api-access-spj25\") pod \"86a74cf2-0185-4f94-9afa-e20d254cdb62\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " Apr 22 18:39:48.775429 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:48.775224 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/86a74cf2-0185-4f94-9afa-e20d254cdb62-console-serving-cert\") pod \"86a74cf2-0185-4f94-9afa-e20d254cdb62\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " Apr 22 18:39:48.775429 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:48.775255 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86a74cf2-0185-4f94-9afa-e20d254cdb62-service-ca\") pod \"86a74cf2-0185-4f94-9afa-e20d254cdb62\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " Apr 22 18:39:48.775429 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:48.775272 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/86a74cf2-0185-4f94-9afa-e20d254cdb62-console-oauth-config\") pod \"86a74cf2-0185-4f94-9afa-e20d254cdb62\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " Apr 22 18:39:48.775429 ip-10-0-131-5 
kubenswrapper[2572]: I0422 18:39:48.775305 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/86a74cf2-0185-4f94-9afa-e20d254cdb62-oauth-serving-cert\") pod \"86a74cf2-0185-4f94-9afa-e20d254cdb62\" (UID: \"86a74cf2-0185-4f94-9afa-e20d254cdb62\") " Apr 22 18:39:48.775629 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:48.775531 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86a74cf2-0185-4f94-9afa-e20d254cdb62-console-config" (OuterVolumeSpecName: "console-config") pod "86a74cf2-0185-4f94-9afa-e20d254cdb62" (UID: "86a74cf2-0185-4f94-9afa-e20d254cdb62"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:39:48.775629 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:48.775607 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86a74cf2-0185-4f94-9afa-e20d254cdb62-service-ca" (OuterVolumeSpecName: "service-ca") pod "86a74cf2-0185-4f94-9afa-e20d254cdb62" (UID: "86a74cf2-0185-4f94-9afa-e20d254cdb62"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:39:48.775800 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:48.775767 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86a74cf2-0185-4f94-9afa-e20d254cdb62-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "86a74cf2-0185-4f94-9afa-e20d254cdb62" (UID: "86a74cf2-0185-4f94-9afa-e20d254cdb62"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:39:48.777470 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:48.777442 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a74cf2-0185-4f94-9afa-e20d254cdb62-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "86a74cf2-0185-4f94-9afa-e20d254cdb62" (UID: "86a74cf2-0185-4f94-9afa-e20d254cdb62"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:39:48.777470 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:48.777458 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a74cf2-0185-4f94-9afa-e20d254cdb62-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "86a74cf2-0185-4f94-9afa-e20d254cdb62" (UID: "86a74cf2-0185-4f94-9afa-e20d254cdb62"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:39:48.777595 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:48.777500 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86a74cf2-0185-4f94-9afa-e20d254cdb62-kube-api-access-spj25" (OuterVolumeSpecName: "kube-api-access-spj25") pod "86a74cf2-0185-4f94-9afa-e20d254cdb62" (UID: "86a74cf2-0185-4f94-9afa-e20d254cdb62"). InnerVolumeSpecName "kube-api-access-spj25". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:39:48.876238 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:48.876198 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-spj25\" (UniqueName: \"kubernetes.io/projected/86a74cf2-0185-4f94-9afa-e20d254cdb62-kube-api-access-spj25\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:39:48.876238 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:48.876232 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/86a74cf2-0185-4f94-9afa-e20d254cdb62-console-serving-cert\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:39:48.876238 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:48.876242 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86a74cf2-0185-4f94-9afa-e20d254cdb62-service-ca\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:39:48.876513 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:48.876252 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/86a74cf2-0185-4f94-9afa-e20d254cdb62-console-oauth-config\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:39:48.876513 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:48.876262 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/86a74cf2-0185-4f94-9afa-e20d254cdb62-oauth-serving-cert\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:39:48.876513 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:48.876271 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/86a74cf2-0185-4f94-9afa-e20d254cdb62-console-config\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:39:49.470154 ip-10-0-131-5 kubenswrapper[2572]: I0422 
18:39:49.470127 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-689dbc67d5-g7jth_86a74cf2-0185-4f94-9afa-e20d254cdb62/console/0.log" Apr 22 18:39:49.470647 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:49.470166 2572 generic.go:358] "Generic (PLEG): container finished" podID="86a74cf2-0185-4f94-9afa-e20d254cdb62" containerID="71b87a330a231dfdfb76f479f80de91214700856eb20a6fb540c7ecc54d879b6" exitCode=2 Apr 22 18:39:49.470647 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:49.470249 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-689dbc67d5-g7jth" event={"ID":"86a74cf2-0185-4f94-9afa-e20d254cdb62","Type":"ContainerDied","Data":"71b87a330a231dfdfb76f479f80de91214700856eb20a6fb540c7ecc54d879b6"} Apr 22 18:39:49.470647 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:49.470279 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-689dbc67d5-g7jth" event={"ID":"86a74cf2-0185-4f94-9afa-e20d254cdb62","Type":"ContainerDied","Data":"010ddd0601d3d3092c2bba0ee31dfd8ae9fc085b4e6ffefcafa996febee728c0"} Apr 22 18:39:49.470647 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:49.470281 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-689dbc67d5-g7jth" Apr 22 18:39:49.470647 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:49.470295 2572 scope.go:117] "RemoveContainer" containerID="71b87a330a231dfdfb76f479f80de91214700856eb20a6fb540c7ecc54d879b6" Apr 22 18:39:49.480006 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:49.479988 2572 scope.go:117] "RemoveContainer" containerID="71b87a330a231dfdfb76f479f80de91214700856eb20a6fb540c7ecc54d879b6" Apr 22 18:39:49.480221 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:39:49.480203 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71b87a330a231dfdfb76f479f80de91214700856eb20a6fb540c7ecc54d879b6\": container with ID starting with 71b87a330a231dfdfb76f479f80de91214700856eb20a6fb540c7ecc54d879b6 not found: ID does not exist" containerID="71b87a330a231dfdfb76f479f80de91214700856eb20a6fb540c7ecc54d879b6" Apr 22 18:39:49.480276 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:49.480228 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71b87a330a231dfdfb76f479f80de91214700856eb20a6fb540c7ecc54d879b6"} err="failed to get container status \"71b87a330a231dfdfb76f479f80de91214700856eb20a6fb540c7ecc54d879b6\": rpc error: code = NotFound desc = could not find container \"71b87a330a231dfdfb76f479f80de91214700856eb20a6fb540c7ecc54d879b6\": container with ID starting with 71b87a330a231dfdfb76f479f80de91214700856eb20a6fb540c7ecc54d879b6 not found: ID does not exist" Apr 22 18:39:49.491346 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:49.491308 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-689dbc67d5-g7jth"] Apr 22 18:39:49.495436 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:39:49.495415 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-689dbc67d5-g7jth"] Apr 22 18:39:49.735278 ip-10-0-131-5 kubenswrapper[2572]: I0422 
18:39:49.735202 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86a74cf2-0185-4f94-9afa-e20d254cdb62" path="/var/lib/kubelet/pods/86a74cf2-0185-4f94-9afa-e20d254cdb62/volumes" Apr 22 18:40:15.871346 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:15.871287 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-65d8bf7b8-8xqkf"] Apr 22 18:40:15.871839 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:15.871718 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86a74cf2-0185-4f94-9afa-e20d254cdb62" containerName="console" Apr 22 18:40:15.871839 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:15.871736 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="86a74cf2-0185-4f94-9afa-e20d254cdb62" containerName="console" Apr 22 18:40:15.871839 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:15.871809 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="86a74cf2-0185-4f94-9afa-e20d254cdb62" containerName="console" Apr 22 18:40:15.874129 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:15.874110 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:15.885707 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:15.885674 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65d8bf7b8-8xqkf"] Apr 22 18:40:16.007077 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.007051 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-console-serving-cert\") pod \"console-65d8bf7b8-8xqkf\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.007195 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.007083 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-oauth-serving-cert\") pod \"console-65d8bf7b8-8xqkf\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.007195 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.007107 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-console-oauth-config\") pod \"console-65d8bf7b8-8xqkf\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.007272 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.007194 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n5fw\" (UniqueName: \"kubernetes.io/projected/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-kube-api-access-9n5fw\") pod \"console-65d8bf7b8-8xqkf\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 
18:40:16.007272 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.007231 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-service-ca\") pod \"console-65d8bf7b8-8xqkf\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.007358 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.007277 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-trusted-ca-bundle\") pod \"console-65d8bf7b8-8xqkf\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.007358 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.007323 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-console-config\") pod \"console-65d8bf7b8-8xqkf\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.108532 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.108501 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-trusted-ca-bundle\") pod \"console-65d8bf7b8-8xqkf\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.108657 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.108547 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-console-config\") pod \"console-65d8bf7b8-8xqkf\" (UID: 
\"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.108697 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.108666 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-console-serving-cert\") pod \"console-65d8bf7b8-8xqkf\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.108731 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.108696 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-oauth-serving-cert\") pod \"console-65d8bf7b8-8xqkf\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.108769 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.108728 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-console-oauth-config\") pod \"console-65d8bf7b8-8xqkf\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.108823 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.108770 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9n5fw\" (UniqueName: \"kubernetes.io/projected/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-kube-api-access-9n5fw\") pod \"console-65d8bf7b8-8xqkf\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.108823 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.108802 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-service-ca\") pod \"console-65d8bf7b8-8xqkf\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.109251 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.109224 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-console-config\") pod \"console-65d8bf7b8-8xqkf\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.109411 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.109392 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-oauth-serving-cert\") pod \"console-65d8bf7b8-8xqkf\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.109466 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.109417 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-service-ca\") pod \"console-65d8bf7b8-8xqkf\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.109466 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.109431 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-trusted-ca-bundle\") pod \"console-65d8bf7b8-8xqkf\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.111050 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.111022 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-console-oauth-config\") pod \"console-65d8bf7b8-8xqkf\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.111155 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.111135 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-console-serving-cert\") pod \"console-65d8bf7b8-8xqkf\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.117741 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.117720 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n5fw\" (UniqueName: \"kubernetes.io/projected/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-kube-api-access-9n5fw\") pod \"console-65d8bf7b8-8xqkf\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.183421 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.183358 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:16.307273 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.307248 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65d8bf7b8-8xqkf"] Apr 22 18:40:16.309048 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:40:16.309019 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fa43ad2_9a88_4f96_a0eb_aa413f3e7ad7.slice/crio-e44ce6f2f6254d9c63e8072d70a6368986830d49f75fc210713df8ee6a402f04 WatchSource:0}: Error finding container e44ce6f2f6254d9c63e8072d70a6368986830d49f75fc210713df8ee6a402f04: Status 404 returned error can't find the container with id e44ce6f2f6254d9c63e8072d70a6368986830d49f75fc210713df8ee6a402f04 Apr 22 18:40:16.411290 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.411251 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs\") pod \"network-metrics-daemon-k7crw\" (UID: \"fff77f0b-c2fb-4acb-b894-ce916d7cf9d2\") " pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:40:16.414132 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.414107 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fff77f0b-c2fb-4acb-b894-ce916d7cf9d2-metrics-certs\") pod \"network-metrics-daemon-k7crw\" (UID: \"fff77f0b-c2fb-4acb-b894-ce916d7cf9d2\") " pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:40:16.434637 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.434573 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dfh6w\"" Apr 22 18:40:16.442720 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.442697 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7crw" Apr 22 18:40:16.547596 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.547565 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65d8bf7b8-8xqkf" event={"ID":"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7","Type":"ContainerStarted","Data":"7fb25a70afec737a3c600e5bbf9e3dedb173a276fc487c9b6dde3e2bcffc38cf"} Apr 22 18:40:16.547596 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.547598 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65d8bf7b8-8xqkf" event={"ID":"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7","Type":"ContainerStarted","Data":"e44ce6f2f6254d9c63e8072d70a6368986830d49f75fc210713df8ee6a402f04"} Apr 22 18:40:16.578622 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.578596 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k7crw"] Apr 22 18:40:16.581628 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:40:16.581605 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfff77f0b_c2fb_4acb_b894_ce916d7cf9d2.slice/crio-98704ab59e8bb1c76072a53276d101532205e256e8583924013c7bc127a027b8 WatchSource:0}: Error finding container 98704ab59e8bb1c76072a53276d101532205e256e8583924013c7bc127a027b8: Status 404 returned error can't find the container with id 98704ab59e8bb1c76072a53276d101532205e256e8583924013c7bc127a027b8 Apr 22 18:40:16.584575 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:16.584536 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-65d8bf7b8-8xqkf" podStartSLOduration=1.584522982 podStartE2EDuration="1.584522982s" podCreationTimestamp="2026-04-22 18:40:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:40:16.583147571 +0000 UTC m=+251.442316290" 
watchObservedRunningTime="2026-04-22 18:40:16.584522982 +0000 UTC m=+251.443691701" Apr 22 18:40:17.553348 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:17.553298 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k7crw" event={"ID":"fff77f0b-c2fb-4acb-b894-ce916d7cf9d2","Type":"ContainerStarted","Data":"22afb6d42d501698fb0304ec9f19ad543fa91a187833ed80143602fbdcfa1e44"} Apr 22 18:40:17.553690 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:17.553359 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k7crw" event={"ID":"fff77f0b-c2fb-4acb-b894-ce916d7cf9d2","Type":"ContainerStarted","Data":"98704ab59e8bb1c76072a53276d101532205e256e8583924013c7bc127a027b8"} Apr 22 18:40:18.557392 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:18.557358 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k7crw" event={"ID":"fff77f0b-c2fb-4acb-b894-ce916d7cf9d2","Type":"ContainerStarted","Data":"e2721e285f511bcd528f41bd1204b80aec9af87e958d5c0a75ab23964e28053c"} Apr 22 18:40:18.575561 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:18.575512 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-k7crw" podStartSLOduration=252.735656771 podStartE2EDuration="4m13.575495352s" podCreationTimestamp="2026-04-22 18:36:05 +0000 UTC" firstStartedPulling="2026-04-22 18:40:16.583285952 +0000 UTC m=+251.442454650" lastFinishedPulling="2026-04-22 18:40:17.423124534 +0000 UTC m=+252.282293231" observedRunningTime="2026-04-22 18:40:18.575023355 +0000 UTC m=+253.434192074" watchObservedRunningTime="2026-04-22 18:40:18.575495352 +0000 UTC m=+253.434664071" Apr 22 18:40:26.184069 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:26.183989 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:26.184527 ip-10-0-131-5 
kubenswrapper[2572]: I0422 18:40:26.184087 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:26.188321 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:26.188300 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:26.585177 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:26.585117 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:40:26.635538 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:26.635510 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5988d77944-k95p8"] Apr 22 18:40:51.660016 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:51.659964 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5988d77944-k95p8" podUID="5017fed1-bd62-4980-bb06-9a5a408f5cac" containerName="console" containerID="cri-o://0115818a3d8034279c06b776d4915bb0d86642dfe44ecc7527cff381291e77fc" gracePeriod=15 Apr 22 18:40:51.913904 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:51.913853 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5988d77944-k95p8_5017fed1-bd62-4980-bb06-9a5a408f5cac/console/0.log" Apr 22 18:40:51.914001 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:51.913911 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:40:51.961848 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:51.961823 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-oauth-serving-cert\") pod \"5017fed1-bd62-4980-bb06-9a5a408f5cac\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " Apr 22 18:40:51.961961 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:51.961854 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5017fed1-bd62-4980-bb06-9a5a408f5cac-console-oauth-config\") pod \"5017fed1-bd62-4980-bb06-9a5a408f5cac\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " Apr 22 18:40:51.961961 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:51.961873 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5017fed1-bd62-4980-bb06-9a5a408f5cac-console-serving-cert\") pod \"5017fed1-bd62-4980-bb06-9a5a408f5cac\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " Apr 22 18:40:51.961961 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:51.961921 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-console-config\") pod \"5017fed1-bd62-4980-bb06-9a5a408f5cac\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " Apr 22 18:40:51.962077 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:51.961976 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-service-ca\") pod \"5017fed1-bd62-4980-bb06-9a5a408f5cac\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " Apr 22 18:40:51.962077 ip-10-0-131-5 
kubenswrapper[2572]: I0422 18:40:51.962011 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkhwz\" (UniqueName: \"kubernetes.io/projected/5017fed1-bd62-4980-bb06-9a5a408f5cac-kube-api-access-dkhwz\") pod \"5017fed1-bd62-4980-bb06-9a5a408f5cac\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " Apr 22 18:40:51.962077 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:51.962055 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-trusted-ca-bundle\") pod \"5017fed1-bd62-4980-bb06-9a5a408f5cac\" (UID: \"5017fed1-bd62-4980-bb06-9a5a408f5cac\") " Apr 22 18:40:51.962494 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:51.962184 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5017fed1-bd62-4980-bb06-9a5a408f5cac" (UID: "5017fed1-bd62-4980-bb06-9a5a408f5cac"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:51.962494 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:51.962371 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-service-ca" (OuterVolumeSpecName: "service-ca") pod "5017fed1-bd62-4980-bb06-9a5a408f5cac" (UID: "5017fed1-bd62-4980-bb06-9a5a408f5cac"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:51.962494 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:51.962394 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-console-config" (OuterVolumeSpecName: "console-config") pod "5017fed1-bd62-4980-bb06-9a5a408f5cac" (UID: "5017fed1-bd62-4980-bb06-9a5a408f5cac"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:51.962691 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:51.962607 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5017fed1-bd62-4980-bb06-9a5a408f5cac" (UID: "5017fed1-bd62-4980-bb06-9a5a408f5cac"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:51.963809 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:51.963783 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5017fed1-bd62-4980-bb06-9a5a408f5cac-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5017fed1-bd62-4980-bb06-9a5a408f5cac" (UID: "5017fed1-bd62-4980-bb06-9a5a408f5cac"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:51.964021 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:51.964000 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5017fed1-bd62-4980-bb06-9a5a408f5cac-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5017fed1-bd62-4980-bb06-9a5a408f5cac" (UID: "5017fed1-bd62-4980-bb06-9a5a408f5cac"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:51.964089 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:51.964002 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5017fed1-bd62-4980-bb06-9a5a408f5cac-kube-api-access-dkhwz" (OuterVolumeSpecName: "kube-api-access-dkhwz") pod "5017fed1-bd62-4980-bb06-9a5a408f5cac" (UID: "5017fed1-bd62-4980-bb06-9a5a408f5cac"). InnerVolumeSpecName "kube-api-access-dkhwz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:40:52.062835 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:52.062815 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dkhwz\" (UniqueName: \"kubernetes.io/projected/5017fed1-bd62-4980-bb06-9a5a408f5cac-kube-api-access-dkhwz\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:40:52.062835 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:52.062835 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-trusted-ca-bundle\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:40:52.062951 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:52.062845 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-oauth-serving-cert\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:40:52.062951 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:52.062854 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5017fed1-bd62-4980-bb06-9a5a408f5cac-console-oauth-config\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:40:52.062951 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:52.062863 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5017fed1-bd62-4980-bb06-9a5a408f5cac-console-serving-cert\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:40:52.062951 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:52.062871 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-console-config\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:40:52.062951 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:52.062880 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5017fed1-bd62-4980-bb06-9a5a408f5cac-service-ca\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:40:52.650044 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:52.650023 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5988d77944-k95p8_5017fed1-bd62-4980-bb06-9a5a408f5cac/console/0.log" Apr 22 18:40:52.650180 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:52.650059 2572 generic.go:358] "Generic (PLEG): container finished" podID="5017fed1-bd62-4980-bb06-9a5a408f5cac" containerID="0115818a3d8034279c06b776d4915bb0d86642dfe44ecc7527cff381291e77fc" exitCode=2 Apr 22 18:40:52.650180 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:52.650100 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5988d77944-k95p8" event={"ID":"5017fed1-bd62-4980-bb06-9a5a408f5cac","Type":"ContainerDied","Data":"0115818a3d8034279c06b776d4915bb0d86642dfe44ecc7527cff381291e77fc"} Apr 22 18:40:52.650180 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:52.650121 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5988d77944-k95p8" event={"ID":"5017fed1-bd62-4980-bb06-9a5a408f5cac","Type":"ContainerDied","Data":"c7962f8ac4396f263122b3978ecf3c38f9a72d7825328fefc247c3ad6f4002eb"} Apr 22 18:40:52.650180 ip-10-0-131-5 kubenswrapper[2572]: I0422 
18:40:52.650119 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5988d77944-k95p8" Apr 22 18:40:52.650312 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:52.650192 2572 scope.go:117] "RemoveContainer" containerID="0115818a3d8034279c06b776d4915bb0d86642dfe44ecc7527cff381291e77fc" Apr 22 18:40:52.658340 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:52.658314 2572 scope.go:117] "RemoveContainer" containerID="0115818a3d8034279c06b776d4915bb0d86642dfe44ecc7527cff381291e77fc" Apr 22 18:40:52.658585 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:40:52.658563 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0115818a3d8034279c06b776d4915bb0d86642dfe44ecc7527cff381291e77fc\": container with ID starting with 0115818a3d8034279c06b776d4915bb0d86642dfe44ecc7527cff381291e77fc not found: ID does not exist" containerID="0115818a3d8034279c06b776d4915bb0d86642dfe44ecc7527cff381291e77fc" Apr 22 18:40:52.658677 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:52.658589 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0115818a3d8034279c06b776d4915bb0d86642dfe44ecc7527cff381291e77fc"} err="failed to get container status \"0115818a3d8034279c06b776d4915bb0d86642dfe44ecc7527cff381291e77fc\": rpc error: code = NotFound desc = could not find container \"0115818a3d8034279c06b776d4915bb0d86642dfe44ecc7527cff381291e77fc\": container with ID starting with 0115818a3d8034279c06b776d4915bb0d86642dfe44ecc7527cff381291e77fc not found: ID does not exist" Apr 22 18:40:52.671033 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:52.671011 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5988d77944-k95p8"] Apr 22 18:40:52.675112 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:52.675092 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-5988d77944-k95p8"] Apr 22 18:40:53.735052 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:40:53.735019 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5017fed1-bd62-4980-bb06-9a5a408f5cac" path="/var/lib/kubelet/pods/5017fed1-bd62-4980-bb06-9a5a408f5cac/volumes" Apr 22 18:41:05.596645 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:05.596618 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 18:41:05.597065 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:05.596716 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 18:41:05.600640 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:05.600615 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 18:41:05.600941 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:05.600921 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 18:41:05.606032 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:05.606017 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:41:24.899478 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:24.899445 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-cff5ccc4-mnt5j"] Apr 22 18:41:24.901945 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:24.900315 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5017fed1-bd62-4980-bb06-9a5a408f5cac" containerName="console" Apr 22 18:41:24.901945 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:24.900364 2572 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="5017fed1-bd62-4980-bb06-9a5a408f5cac" containerName="console" Apr 22 18:41:24.901945 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:24.900453 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5017fed1-bd62-4980-bb06-9a5a408f5cac" containerName="console" Apr 22 18:41:24.906054 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:24.906031 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:24.912672 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:24.912649 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cff5ccc4-mnt5j"] Apr 22 18:41:24.986279 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:24.986254 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-console-config\") pod \"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:24.986440 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:24.986297 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pdbd\" (UniqueName: \"kubernetes.io/projected/08ddac11-7a61-46c9-bd54-bbc43caba02f-kube-api-access-7pdbd\") pod \"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:24.986440 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:24.986373 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-trusted-ca-bundle\") pod \"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 
18:41:24.986440 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:24.986424 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/08ddac11-7a61-46c9-bd54-bbc43caba02f-console-oauth-config\") pod \"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:24.986551 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:24.986462 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-oauth-serving-cert\") pod \"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:24.986551 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:24.986483 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/08ddac11-7a61-46c9-bd54-bbc43caba02f-console-serving-cert\") pod \"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:24.986551 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:24.986517 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-service-ca\") pod \"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:25.087837 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:25.087805 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pdbd\" (UniqueName: \"kubernetes.io/projected/08ddac11-7a61-46c9-bd54-bbc43caba02f-kube-api-access-7pdbd\") pod 
\"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:25.087987 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:25.087858 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-trusted-ca-bundle\") pod \"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:25.087987 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:25.087883 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/08ddac11-7a61-46c9-bd54-bbc43caba02f-console-oauth-config\") pod \"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:25.087987 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:25.087944 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-oauth-serving-cert\") pod \"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:25.088102 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:25.088016 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/08ddac11-7a61-46c9-bd54-bbc43caba02f-console-serving-cert\") pod \"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:25.088102 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:25.088035 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-service-ca\") pod \"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:25.088102 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:25.088084 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-console-config\") pod \"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:25.088716 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:25.088684 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-oauth-serving-cert\") pod \"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:25.088863 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:25.088703 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-console-config\") pod \"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:25.088863 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:25.088797 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-service-ca\") pod \"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:25.088977 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:25.088943 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-trusted-ca-bundle\") pod \"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:25.090525 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:25.090497 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/08ddac11-7a61-46c9-bd54-bbc43caba02f-console-oauth-config\") pod \"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:25.090693 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:25.090674 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/08ddac11-7a61-46c9-bd54-bbc43caba02f-console-serving-cert\") pod \"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:25.095319 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:25.095301 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pdbd\" (UniqueName: \"kubernetes.io/projected/08ddac11-7a61-46c9-bd54-bbc43caba02f-kube-api-access-7pdbd\") pod \"console-cff5ccc4-mnt5j\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") " pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:25.217125 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:25.217058 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:25.339220 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:25.339197 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cff5ccc4-mnt5j"] Apr 22 18:41:25.341497 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:41:25.341471 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08ddac11_7a61_46c9_bd54_bbc43caba02f.slice/crio-50dd3fa7bc98ff65bac3935412ca6f3aa41b132a83f1a09fd95364515554be5e WatchSource:0}: Error finding container 50dd3fa7bc98ff65bac3935412ca6f3aa41b132a83f1a09fd95364515554be5e: Status 404 returned error can't find the container with id 50dd3fa7bc98ff65bac3935412ca6f3aa41b132a83f1a09fd95364515554be5e Apr 22 18:41:25.343277 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:25.343259 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:41:25.735925 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:25.735894 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cff5ccc4-mnt5j" event={"ID":"08ddac11-7a61-46c9-bd54-bbc43caba02f","Type":"ContainerStarted","Data":"375d1d2300d45358b728921675a4727bb6ef8b6c9c4be35134762dbbc5ca32f6"} Apr 22 18:41:25.735925 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:25.735926 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cff5ccc4-mnt5j" event={"ID":"08ddac11-7a61-46c9-bd54-bbc43caba02f","Type":"ContainerStarted","Data":"50dd3fa7bc98ff65bac3935412ca6f3aa41b132a83f1a09fd95364515554be5e"} Apr 22 18:41:25.767221 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:25.767172 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-cff5ccc4-mnt5j" podStartSLOduration=1.767156515 podStartE2EDuration="1.767156515s" podCreationTimestamp="2026-04-22 18:41:24 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:41:25.766097827 +0000 UTC m=+320.625266544" watchObservedRunningTime="2026-04-22 18:41:25.767156515 +0000 UTC m=+320.626325234" Apr 22 18:41:35.218026 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:35.217973 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:35.218026 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:35.218034 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:35.223435 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:35.223409 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:35.765652 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:35.765624 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-cff5ccc4-mnt5j" Apr 22 18:41:35.813756 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:35.813721 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65d8bf7b8-8xqkf"] Apr 22 18:41:59.496345 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:59.496291 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hj4cl"] Apr 22 18:41:59.499544 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:59.499524 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hj4cl" Apr 22 18:41:59.502608 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:59.502589 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:41:59.508784 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:59.508763 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hj4cl"] Apr 22 18:41:59.551495 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:59.551472 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/977893ce-dc9d-42ee-9339-051be82076b8-kubelet-config\") pod \"global-pull-secret-syncer-hj4cl\" (UID: \"977893ce-dc9d-42ee-9339-051be82076b8\") " pod="kube-system/global-pull-secret-syncer-hj4cl" Apr 22 18:41:59.551579 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:59.551517 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/977893ce-dc9d-42ee-9339-051be82076b8-original-pull-secret\") pod \"global-pull-secret-syncer-hj4cl\" (UID: \"977893ce-dc9d-42ee-9339-051be82076b8\") " pod="kube-system/global-pull-secret-syncer-hj4cl" Apr 22 18:41:59.551637 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:59.551617 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/977893ce-dc9d-42ee-9339-051be82076b8-dbus\") pod \"global-pull-secret-syncer-hj4cl\" (UID: \"977893ce-dc9d-42ee-9339-051be82076b8\") " pod="kube-system/global-pull-secret-syncer-hj4cl" Apr 22 18:41:59.652401 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:59.652378 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/977893ce-dc9d-42ee-9339-051be82076b8-kubelet-config\") pod \"global-pull-secret-syncer-hj4cl\" (UID: \"977893ce-dc9d-42ee-9339-051be82076b8\") " pod="kube-system/global-pull-secret-syncer-hj4cl" Apr 22 18:41:59.652519 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:59.652416 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/977893ce-dc9d-42ee-9339-051be82076b8-original-pull-secret\") pod \"global-pull-secret-syncer-hj4cl\" (UID: \"977893ce-dc9d-42ee-9339-051be82076b8\") " pod="kube-system/global-pull-secret-syncer-hj4cl" Apr 22 18:41:59.652519 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:59.652495 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/977893ce-dc9d-42ee-9339-051be82076b8-kubelet-config\") pod \"global-pull-secret-syncer-hj4cl\" (UID: \"977893ce-dc9d-42ee-9339-051be82076b8\") " pod="kube-system/global-pull-secret-syncer-hj4cl" Apr 22 18:41:59.652619 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:59.652544 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/977893ce-dc9d-42ee-9339-051be82076b8-dbus\") pod \"global-pull-secret-syncer-hj4cl\" (UID: \"977893ce-dc9d-42ee-9339-051be82076b8\") " pod="kube-system/global-pull-secret-syncer-hj4cl" Apr 22 18:41:59.652706 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:59.652690 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/977893ce-dc9d-42ee-9339-051be82076b8-dbus\") pod \"global-pull-secret-syncer-hj4cl\" (UID: \"977893ce-dc9d-42ee-9339-051be82076b8\") " pod="kube-system/global-pull-secret-syncer-hj4cl" Apr 22 18:41:59.654519 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:59.654495 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/977893ce-dc9d-42ee-9339-051be82076b8-original-pull-secret\") pod \"global-pull-secret-syncer-hj4cl\" (UID: \"977893ce-dc9d-42ee-9339-051be82076b8\") " pod="kube-system/global-pull-secret-syncer-hj4cl" Apr 22 18:41:59.808995 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:59.808944 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hj4cl" Apr 22 18:41:59.923162 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:41:59.923132 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hj4cl"] Apr 22 18:41:59.925963 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:41:59.925932 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod977893ce_dc9d_42ee_9339_051be82076b8.slice/crio-bf4d7bbabe98d39d24b72dc8273324a169d0a08a13ff5f237f3861c530c693da WatchSource:0}: Error finding container bf4d7bbabe98d39d24b72dc8273324a169d0a08a13ff5f237f3861c530c693da: Status 404 returned error can't find the container with id bf4d7bbabe98d39d24b72dc8273324a169d0a08a13ff5f237f3861c530c693da Apr 22 18:42:00.828281 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:00.828236 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hj4cl" event={"ID":"977893ce-dc9d-42ee-9339-051be82076b8","Type":"ContainerStarted","Data":"bf4d7bbabe98d39d24b72dc8273324a169d0a08a13ff5f237f3861c530c693da"} Apr 22 18:42:00.834843 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:00.834788 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-65d8bf7b8-8xqkf" podUID="6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7" containerName="console" containerID="cri-o://7fb25a70afec737a3c600e5bbf9e3dedb173a276fc487c9b6dde3e2bcffc38cf" gracePeriod=15 Apr 22 18:42:01.077250 ip-10-0-131-5 kubenswrapper[2572]: I0422 
18:42:01.077223 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65d8bf7b8-8xqkf_6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7/console/0.log" Apr 22 18:42:01.077387 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.077296 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:42:01.163881 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.163839 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-trusted-ca-bundle\") pod \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " Apr 22 18:42:01.164071 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.163892 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-console-config\") pod \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " Apr 22 18:42:01.164071 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.163940 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-service-ca\") pod \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " Apr 22 18:42:01.164071 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.163971 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n5fw\" (UniqueName: \"kubernetes.io/projected/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-kube-api-access-9n5fw\") pod \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " Apr 22 18:42:01.164234 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.164120 2572 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-oauth-serving-cert\") pod \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " Apr 22 18:42:01.164234 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.164175 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-console-serving-cert\") pod \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " Apr 22 18:42:01.164234 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.164206 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-console-oauth-config\") pod \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\" (UID: \"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7\") " Apr 22 18:42:01.164401 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.164317 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7" (UID: "6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:42:01.164453 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.164421 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-console-config" (OuterVolumeSpecName: "console-config") pod "6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7" (UID: "6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:42:01.164516 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.164494 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-trusted-ca-bundle\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:42:01.164568 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.164519 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-console-config\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:42:01.164871 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.164829 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-service-ca" (OuterVolumeSpecName: "service-ca") pod "6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7" (UID: "6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:42:01.164871 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.164833 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7" (UID: "6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:42:01.166712 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.166687 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7" (UID: "6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:42:01.166849 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.166823 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-kube-api-access-9n5fw" (OuterVolumeSpecName: "kube-api-access-9n5fw") pod "6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7" (UID: "6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7"). InnerVolumeSpecName "kube-api-access-9n5fw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:42:01.166954 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.166859 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7" (UID: "6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:42:01.265611 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.265570 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-console-oauth-config\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:42:01.265611 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.265610 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-service-ca\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:42:01.265801 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.265626 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9n5fw\" (UniqueName: \"kubernetes.io/projected/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-kube-api-access-9n5fw\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 
18:42:01.265801 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.265641 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-oauth-serving-cert\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:42:01.265801 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.265656 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7-console-serving-cert\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:42:01.831864 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.831835 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65d8bf7b8-8xqkf_6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7/console/0.log" Apr 22 18:42:01.832313 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.831873 2572 generic.go:358] "Generic (PLEG): container finished" podID="6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7" containerID="7fb25a70afec737a3c600e5bbf9e3dedb173a276fc487c9b6dde3e2bcffc38cf" exitCode=2 Apr 22 18:42:01.832313 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.831945 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65d8bf7b8-8xqkf" event={"ID":"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7","Type":"ContainerDied","Data":"7fb25a70afec737a3c600e5bbf9e3dedb173a276fc487c9b6dde3e2bcffc38cf"} Apr 22 18:42:01.832313 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.831971 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65d8bf7b8-8xqkf" event={"ID":"6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7","Type":"ContainerDied","Data":"e44ce6f2f6254d9c63e8072d70a6368986830d49f75fc210713df8ee6a402f04"} Apr 22 18:42:01.832313 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.831971 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-65d8bf7b8-8xqkf" Apr 22 18:42:01.832313 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.831985 2572 scope.go:117] "RemoveContainer" containerID="7fb25a70afec737a3c600e5bbf9e3dedb173a276fc487c9b6dde3e2bcffc38cf" Apr 22 18:42:01.839521 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.839270 2572 scope.go:117] "RemoveContainer" containerID="7fb25a70afec737a3c600e5bbf9e3dedb173a276fc487c9b6dde3e2bcffc38cf" Apr 22 18:42:01.839571 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:42:01.839521 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fb25a70afec737a3c600e5bbf9e3dedb173a276fc487c9b6dde3e2bcffc38cf\": container with ID starting with 7fb25a70afec737a3c600e5bbf9e3dedb173a276fc487c9b6dde3e2bcffc38cf not found: ID does not exist" containerID="7fb25a70afec737a3c600e5bbf9e3dedb173a276fc487c9b6dde3e2bcffc38cf" Apr 22 18:42:01.839571 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.839544 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fb25a70afec737a3c600e5bbf9e3dedb173a276fc487c9b6dde3e2bcffc38cf"} err="failed to get container status \"7fb25a70afec737a3c600e5bbf9e3dedb173a276fc487c9b6dde3e2bcffc38cf\": rpc error: code = NotFound desc = could not find container \"7fb25a70afec737a3c600e5bbf9e3dedb173a276fc487c9b6dde3e2bcffc38cf\": container with ID starting with 7fb25a70afec737a3c600e5bbf9e3dedb173a276fc487c9b6dde3e2bcffc38cf not found: ID does not exist" Apr 22 18:42:01.850913 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.850882 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65d8bf7b8-8xqkf"] Apr 22 18:42:01.854863 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:01.854842 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-65d8bf7b8-8xqkf"] Apr 22 18:42:03.735778 ip-10-0-131-5 kubenswrapper[2572]: I0422 
18:42:03.735699 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7" path="/var/lib/kubelet/pods/6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7/volumes" Apr 22 18:42:03.839868 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:03.839831 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hj4cl" event={"ID":"977893ce-dc9d-42ee-9339-051be82076b8","Type":"ContainerStarted","Data":"8353f691f845edc98648456875fc56f83b8929060ccc615426c87b6c9711efcc"} Apr 22 18:42:03.857486 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:03.857431 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hj4cl" podStartSLOduration=1.299864785 podStartE2EDuration="4.857413987s" podCreationTimestamp="2026-04-22 18:41:59 +0000 UTC" firstStartedPulling="2026-04-22 18:41:59.927572324 +0000 UTC m=+354.786741022" lastFinishedPulling="2026-04-22 18:42:03.485121524 +0000 UTC m=+358.344290224" observedRunningTime="2026-04-22 18:42:03.856623113 +0000 UTC m=+358.715791832" watchObservedRunningTime="2026-04-22 18:42:03.857413987 +0000 UTC m=+358.716582705" Apr 22 18:42:18.901213 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:18.901182 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8"] Apr 22 18:42:18.901672 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:18.901498 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7" containerName="console" Apr 22 18:42:18.901672 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:18.901509 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7" containerName="console" Apr 22 18:42:18.901672 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:18.901568 2572 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="6fa43ad2-9a88-4f96-a0eb-aa413f3e7ad7" containerName="console" Apr 22 18:42:18.906935 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:18.906918 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8" Apr 22 18:42:18.909711 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:18.909693 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:42:18.910584 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:18.910568 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-2zkfp\"" Apr 22 18:42:18.910635 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:18.910581 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:42:18.916222 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:18.916203 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8"] Apr 22 18:42:19.000797 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:19.000766 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdmsm\" (UniqueName: \"kubernetes.io/projected/5c9796ec-93aa-41aa-b7af-a1a8b6e29241-kube-api-access-jdmsm\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8\" (UID: \"5c9796ec-93aa-41aa-b7af-a1a8b6e29241\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8" Apr 22 18:42:19.000929 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:19.000810 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c9796ec-93aa-41aa-b7af-a1a8b6e29241-bundle\") pod 
\"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8\" (UID: \"5c9796ec-93aa-41aa-b7af-a1a8b6e29241\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8" Apr 22 18:42:19.000929 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:19.000868 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c9796ec-93aa-41aa-b7af-a1a8b6e29241-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8\" (UID: \"5c9796ec-93aa-41aa-b7af-a1a8b6e29241\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8" Apr 22 18:42:19.102213 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:19.102185 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdmsm\" (UniqueName: \"kubernetes.io/projected/5c9796ec-93aa-41aa-b7af-a1a8b6e29241-kube-api-access-jdmsm\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8\" (UID: \"5c9796ec-93aa-41aa-b7af-a1a8b6e29241\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8" Apr 22 18:42:19.102323 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:19.102225 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c9796ec-93aa-41aa-b7af-a1a8b6e29241-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8\" (UID: \"5c9796ec-93aa-41aa-b7af-a1a8b6e29241\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8" Apr 22 18:42:19.102323 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:19.102252 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c9796ec-93aa-41aa-b7af-a1a8b6e29241-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8\" 
(UID: \"5c9796ec-93aa-41aa-b7af-a1a8b6e29241\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8" Apr 22 18:42:19.102649 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:19.102628 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c9796ec-93aa-41aa-b7af-a1a8b6e29241-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8\" (UID: \"5c9796ec-93aa-41aa-b7af-a1a8b6e29241\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8" Apr 22 18:42:19.102706 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:19.102630 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c9796ec-93aa-41aa-b7af-a1a8b6e29241-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8\" (UID: \"5c9796ec-93aa-41aa-b7af-a1a8b6e29241\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8" Apr 22 18:42:19.110719 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:19.110700 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdmsm\" (UniqueName: \"kubernetes.io/projected/5c9796ec-93aa-41aa-b7af-a1a8b6e29241-kube-api-access-jdmsm\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8\" (UID: \"5c9796ec-93aa-41aa-b7af-a1a8b6e29241\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8" Apr 22 18:42:19.215822 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:19.215767 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8" Apr 22 18:42:19.333494 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:19.333465 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8"] Apr 22 18:42:19.336621 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:42:19.336598 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c9796ec_93aa_41aa_b7af_a1a8b6e29241.slice/crio-691989c74118f7f3aa3063ce7c0bad719a5291d42ddebb30313584dc03bd91f4 WatchSource:0}: Error finding container 691989c74118f7f3aa3063ce7c0bad719a5291d42ddebb30313584dc03bd91f4: Status 404 returned error can't find the container with id 691989c74118f7f3aa3063ce7c0bad719a5291d42ddebb30313584dc03bd91f4 Apr 22 18:42:19.886988 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:19.886953 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8" event={"ID":"5c9796ec-93aa-41aa-b7af-a1a8b6e29241","Type":"ContainerStarted","Data":"691989c74118f7f3aa3063ce7c0bad719a5291d42ddebb30313584dc03bd91f4"} Apr 22 18:42:24.912558 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:24.912521 2572 generic.go:358] "Generic (PLEG): container finished" podID="5c9796ec-93aa-41aa-b7af-a1a8b6e29241" containerID="0714c4dbf00604c4dbf0af1a606898cb296e9d6e8759e2c0c834d7a2b7934432" exitCode=0 Apr 22 18:42:24.912951 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:24.912602 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8" event={"ID":"5c9796ec-93aa-41aa-b7af-a1a8b6e29241","Type":"ContainerDied","Data":"0714c4dbf00604c4dbf0af1a606898cb296e9d6e8759e2c0c834d7a2b7934432"} Apr 22 18:42:29.928225 ip-10-0-131-5 kubenswrapper[2572]: I0422 
18:42:29.928174 2572 generic.go:358] "Generic (PLEG): container finished" podID="5c9796ec-93aa-41aa-b7af-a1a8b6e29241" containerID="69d713298345d89df8b3df4bc98768d0c256c07a297cc12e29417adf430ed089" exitCode=0 Apr 22 18:42:29.928751 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:29.928246 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8" event={"ID":"5c9796ec-93aa-41aa-b7af-a1a8b6e29241","Type":"ContainerDied","Data":"69d713298345d89df8b3df4bc98768d0c256c07a297cc12e29417adf430ed089"} Apr 22 18:42:41.961953 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:41.961921 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8" event={"ID":"5c9796ec-93aa-41aa-b7af-a1a8b6e29241","Type":"ContainerStarted","Data":"198cca7526be4ba39c459945746cdc4804dc102e846b462ba3efe1540b425770"} Apr 22 18:42:41.982185 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:41.982110 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8" podStartSLOduration=1.43833087 podStartE2EDuration="23.982095469s" podCreationTimestamp="2026-04-22 18:42:18 +0000 UTC" firstStartedPulling="2026-04-22 18:42:19.338736607 +0000 UTC m=+374.197905305" lastFinishedPulling="2026-04-22 18:42:41.882501203 +0000 UTC m=+396.741669904" observedRunningTime="2026-04-22 18:42:41.981570999 +0000 UTC m=+396.840739719" watchObservedRunningTime="2026-04-22 18:42:41.982095469 +0000 UTC m=+396.841264188" Apr 22 18:42:42.965958 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:42.965927 2572 generic.go:358] "Generic (PLEG): container finished" podID="5c9796ec-93aa-41aa-b7af-a1a8b6e29241" containerID="198cca7526be4ba39c459945746cdc4804dc102e846b462ba3efe1540b425770" exitCode=0 Apr 22 18:42:42.966322 ip-10-0-131-5 kubenswrapper[2572]: I0422 
18:42:42.966017 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8" event={"ID":"5c9796ec-93aa-41aa-b7af-a1a8b6e29241","Type":"ContainerDied","Data":"198cca7526be4ba39c459945746cdc4804dc102e846b462ba3efe1540b425770"} Apr 22 18:42:44.086070 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:44.086043 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8" Apr 22 18:42:44.214019 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:44.213986 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c9796ec-93aa-41aa-b7af-a1a8b6e29241-bundle\") pod \"5c9796ec-93aa-41aa-b7af-a1a8b6e29241\" (UID: \"5c9796ec-93aa-41aa-b7af-a1a8b6e29241\") " Apr 22 18:42:44.214180 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:44.214049 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdmsm\" (UniqueName: \"kubernetes.io/projected/5c9796ec-93aa-41aa-b7af-a1a8b6e29241-kube-api-access-jdmsm\") pod \"5c9796ec-93aa-41aa-b7af-a1a8b6e29241\" (UID: \"5c9796ec-93aa-41aa-b7af-a1a8b6e29241\") " Apr 22 18:42:44.214180 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:44.214072 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c9796ec-93aa-41aa-b7af-a1a8b6e29241-util\") pod \"5c9796ec-93aa-41aa-b7af-a1a8b6e29241\" (UID: \"5c9796ec-93aa-41aa-b7af-a1a8b6e29241\") " Apr 22 18:42:44.214663 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:44.214641 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c9796ec-93aa-41aa-b7af-a1a8b6e29241-bundle" (OuterVolumeSpecName: "bundle") pod "5c9796ec-93aa-41aa-b7af-a1a8b6e29241" (UID: 
"5c9796ec-93aa-41aa-b7af-a1a8b6e29241"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:42:44.216297 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:44.216266 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c9796ec-93aa-41aa-b7af-a1a8b6e29241-kube-api-access-jdmsm" (OuterVolumeSpecName: "kube-api-access-jdmsm") pod "5c9796ec-93aa-41aa-b7af-a1a8b6e29241" (UID: "5c9796ec-93aa-41aa-b7af-a1a8b6e29241"). InnerVolumeSpecName "kube-api-access-jdmsm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:42:44.218834 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:44.218811 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c9796ec-93aa-41aa-b7af-a1a8b6e29241-util" (OuterVolumeSpecName: "util") pod "5c9796ec-93aa-41aa-b7af-a1a8b6e29241" (UID: "5c9796ec-93aa-41aa-b7af-a1a8b6e29241"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:42:44.315304 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:44.315268 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c9796ec-93aa-41aa-b7af-a1a8b6e29241-bundle\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:42:44.315304 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:44.315292 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jdmsm\" (UniqueName: \"kubernetes.io/projected/5c9796ec-93aa-41aa-b7af-a1a8b6e29241-kube-api-access-jdmsm\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:42:44.315304 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:44.315303 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c9796ec-93aa-41aa-b7af-a1a8b6e29241-util\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\"" Apr 22 18:42:44.973906 ip-10-0-131-5 kubenswrapper[2572]: 
I0422 18:42:44.973866 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8" event={"ID":"5c9796ec-93aa-41aa-b7af-a1a8b6e29241","Type":"ContainerDied","Data":"691989c74118f7f3aa3063ce7c0bad719a5291d42ddebb30313584dc03bd91f4"} Apr 22 18:42:44.974079 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:44.973923 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="691989c74118f7f3aa3063ce7c0bad719a5291d42ddebb30313584dc03bd91f4" Apr 22 18:42:44.974079 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:44.973885 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4bx8" Apr 22 18:42:51.153070 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.153032 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs"] Apr 22 18:42:51.153527 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.153371 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c9796ec-93aa-41aa-b7af-a1a8b6e29241" containerName="pull" Apr 22 18:42:51.153527 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.153387 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9796ec-93aa-41aa-b7af-a1a8b6e29241" containerName="pull" Apr 22 18:42:51.153527 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.153397 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c9796ec-93aa-41aa-b7af-a1a8b6e29241" containerName="extract" Apr 22 18:42:51.153527 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.153402 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9796ec-93aa-41aa-b7af-a1a8b6e29241" containerName="extract" Apr 22 18:42:51.153527 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.153428 2572 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="5c9796ec-93aa-41aa-b7af-a1a8b6e29241" containerName="util" Apr 22 18:42:51.153527 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.153434 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9796ec-93aa-41aa-b7af-a1a8b6e29241" containerName="util" Apr 22 18:42:51.153527 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.153486 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c9796ec-93aa-41aa-b7af-a1a8b6e29241" containerName="extract" Apr 22 18:42:51.159295 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.159279 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs" Apr 22 18:42:51.161812 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.161789 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 18:42:51.161812 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.161811 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-hjbwh\"" Apr 22 18:42:51.161989 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.161814 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 18:42:51.161989 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.161893 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 18:42:51.167202 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.167176 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs"] Apr 22 18:42:51.266625 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.266591 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52hq2\" 
(UniqueName: \"kubernetes.io/projected/57c23958-2e02-4d88-93dc-de637620d7e5-kube-api-access-52hq2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs\" (UID: \"57c23958-2e02-4d88-93dc-de637620d7e5\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs" Apr 22 18:42:51.266785 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.266700 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/57c23958-2e02-4d88-93dc-de637620d7e5-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs\" (UID: \"57c23958-2e02-4d88-93dc-de637620d7e5\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs" Apr 22 18:42:51.367515 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.367482 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52hq2\" (UniqueName: \"kubernetes.io/projected/57c23958-2e02-4d88-93dc-de637620d7e5-kube-api-access-52hq2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs\" (UID: \"57c23958-2e02-4d88-93dc-de637620d7e5\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs" Apr 22 18:42:51.367629 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.367554 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/57c23958-2e02-4d88-93dc-de637620d7e5-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs\" (UID: \"57c23958-2e02-4d88-93dc-de637620d7e5\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs" Apr 22 18:42:51.369872 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.369849 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/57c23958-2e02-4d88-93dc-de637620d7e5-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs\" (UID: 
\"57c23958-2e02-4d88-93dc-de637620d7e5\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs" Apr 22 18:42:51.376648 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.376621 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52hq2\" (UniqueName: \"kubernetes.io/projected/57c23958-2e02-4d88-93dc-de637620d7e5-kube-api-access-52hq2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs\" (UID: \"57c23958-2e02-4d88-93dc-de637620d7e5\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs" Apr 22 18:42:51.470133 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.470053 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs" Apr 22 18:42:51.590955 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.590914 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs"] Apr 22 18:42:51.594659 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:42:51.594634 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57c23958_2e02_4d88_93dc_de637620d7e5.slice/crio-cf2e726485f415fa6f902918a55393a667bfee678345facd10e6ddaf842342c1 WatchSource:0}: Error finding container cf2e726485f415fa6f902918a55393a667bfee678345facd10e6ddaf842342c1: Status 404 returned error can't find the container with id cf2e726485f415fa6f902918a55393a667bfee678345facd10e6ddaf842342c1 Apr 22 18:42:51.994408 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:51.994370 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs" event={"ID":"57c23958-2e02-4d88-93dc-de637620d7e5","Type":"ContainerStarted","Data":"cf2e726485f415fa6f902918a55393a667bfee678345facd10e6ddaf842342c1"} Apr 22 18:42:58.653873 ip-10-0-131-5 kubenswrapper[2572]: I0422 
18:42:58.653841 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lc4lk"] Apr 22 18:42:58.657033 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:58.657016 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-lc4lk" Apr 22 18:42:58.659517 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:58.659492 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 22 18:42:58.659671 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:58.659650 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 22 18:42:58.659799 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:58.659723 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-kvsvw\"" Apr 22 18:42:58.667932 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:58.667910 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lc4lk"] Apr 22 18:42:58.827563 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:58.827525 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5cc5\" (UniqueName: \"kubernetes.io/projected/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-kube-api-access-l5cc5\") pod \"keda-operator-ffbb595cb-lc4lk\" (UID: \"cee90001-a1f0-47a5-942e-b79f2c0c6ab3\") " pod="openshift-keda/keda-operator-ffbb595cb-lc4lk" Apr 22 18:42:58.827778 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:58.827580 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-certificates\") pod \"keda-operator-ffbb595cb-lc4lk\" (UID: \"cee90001-a1f0-47a5-942e-b79f2c0c6ab3\") " 
pod="openshift-keda/keda-operator-ffbb595cb-lc4lk" Apr 22 18:42:58.827778 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:58.827663 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-cabundle0\") pod \"keda-operator-ffbb595cb-lc4lk\" (UID: \"cee90001-a1f0-47a5-942e-b79f2c0c6ab3\") " pod="openshift-keda/keda-operator-ffbb595cb-lc4lk" Apr 22 18:42:58.929088 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:58.929004 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5cc5\" (UniqueName: \"kubernetes.io/projected/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-kube-api-access-l5cc5\") pod \"keda-operator-ffbb595cb-lc4lk\" (UID: \"cee90001-a1f0-47a5-942e-b79f2c0c6ab3\") " pod="openshift-keda/keda-operator-ffbb595cb-lc4lk" Apr 22 18:42:58.929088 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:58.929056 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-certificates\") pod \"keda-operator-ffbb595cb-lc4lk\" (UID: \"cee90001-a1f0-47a5-942e-b79f2c0c6ab3\") " pod="openshift-keda/keda-operator-ffbb595cb-lc4lk" Apr 22 18:42:58.929250 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:58.929096 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-cabundle0\") pod \"keda-operator-ffbb595cb-lc4lk\" (UID: \"cee90001-a1f0-47a5-942e-b79f2c0c6ab3\") " pod="openshift-keda/keda-operator-ffbb595cb-lc4lk" Apr 22 18:42:58.929250 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:42:58.929221 2572 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:42:58.929250 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:42:58.929239 2572 projected.go:277] Couldn't get secret payload 
openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:42:58.929250 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:42:58.929248 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-lc4lk: references non-existent secret key: ca.crt Apr 22 18:42:58.929404 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:42:58.929310 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-certificates podName:cee90001-a1f0-47a5-942e-b79f2c0c6ab3 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:59.42929535 +0000 UTC m=+414.288464047 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-certificates") pod "keda-operator-ffbb595cb-lc4lk" (UID: "cee90001-a1f0-47a5-942e-b79f2c0c6ab3") : references non-existent secret key: ca.crt Apr 22 18:42:58.929717 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:58.929700 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-cabundle0\") pod \"keda-operator-ffbb595cb-lc4lk\" (UID: \"cee90001-a1f0-47a5-942e-b79f2c0c6ab3\") " pod="openshift-keda/keda-operator-ffbb595cb-lc4lk" Apr 22 18:42:58.953372 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:58.953346 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5cc5\" (UniqueName: \"kubernetes.io/projected/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-kube-api-access-l5cc5\") pod \"keda-operator-ffbb595cb-lc4lk\" (UID: \"cee90001-a1f0-47a5-942e-b79f2c0c6ab3\") " pod="openshift-keda/keda-operator-ffbb595cb-lc4lk" Apr 22 18:42:59.019128 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:59.019091 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs" event={"ID":"57c23958-2e02-4d88-93dc-de637620d7e5","Type":"ContainerStarted","Data":"5f3fe2fa945c9838a3d9ae15ad976858f4693cbe37742507fa80a53767cef0c3"} Apr 22 18:42:59.019301 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:59.019211 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs" Apr 22 18:42:59.049096 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:59.049051 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs" podStartSLOduration=1.557985336 podStartE2EDuration="8.04903875s" podCreationTimestamp="2026-04-22 18:42:51 +0000 UTC" firstStartedPulling="2026-04-22 18:42:51.596298507 +0000 UTC m=+406.455467204" lastFinishedPulling="2026-04-22 18:42:58.087351912 +0000 UTC m=+412.946520618" observedRunningTime="2026-04-22 18:42:59.048030463 +0000 UTC m=+413.907199182" watchObservedRunningTime="2026-04-22 18:42:59.04903875 +0000 UTC m=+413.908207469" Apr 22 18:42:59.301954 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:59.301880 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-fkfjk"] Apr 22 18:42:59.305177 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:59.305159 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-fkfjk" Apr 22 18:42:59.307765 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:59.307732 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 22 18:42:59.314430 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:59.314409 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-fkfjk"] Apr 22 18:42:59.433758 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:59.433707 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gmxr\" (UniqueName: \"kubernetes.io/projected/e8f5c55c-91d9-4eb8-941b-d848cb7a492c-kube-api-access-7gmxr\") pod \"keda-admission-cf49989db-fkfjk\" (UID: \"e8f5c55c-91d9-4eb8-941b-d848cb7a492c\") " pod="openshift-keda/keda-admission-cf49989db-fkfjk" Apr 22 18:42:59.433955 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:59.433783 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-certificates\") pod \"keda-operator-ffbb595cb-lc4lk\" (UID: \"cee90001-a1f0-47a5-942e-b79f2c0c6ab3\") " pod="openshift-keda/keda-operator-ffbb595cb-lc4lk" Apr 22 18:42:59.433955 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:59.433811 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e8f5c55c-91d9-4eb8-941b-d848cb7a492c-certificates\") pod \"keda-admission-cf49989db-fkfjk\" (UID: \"e8f5c55c-91d9-4eb8-941b-d848cb7a492c\") " pod="openshift-keda/keda-admission-cf49989db-fkfjk" Apr 22 18:42:59.434077 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:42:59.434021 2572 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:42:59.434077 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:42:59.434037 2572 
projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:42:59.434077 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:42:59.434047 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-lc4lk: references non-existent secret key: ca.crt Apr 22 18:42:59.434216 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:42:59.434098 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-certificates podName:cee90001-a1f0-47a5-942e-b79f2c0c6ab3 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:00.434079608 +0000 UTC m=+415.293248311 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-certificates") pod "keda-operator-ffbb595cb-lc4lk" (UID: "cee90001-a1f0-47a5-942e-b79f2c0c6ab3") : references non-existent secret key: ca.crt Apr 22 18:42:59.534671 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:59.534634 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e8f5c55c-91d9-4eb8-941b-d848cb7a492c-certificates\") pod \"keda-admission-cf49989db-fkfjk\" (UID: \"e8f5c55c-91d9-4eb8-941b-d848cb7a492c\") " pod="openshift-keda/keda-admission-cf49989db-fkfjk" Apr 22 18:42:59.534851 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:59.534782 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gmxr\" (UniqueName: \"kubernetes.io/projected/e8f5c55c-91d9-4eb8-941b-d848cb7a492c-kube-api-access-7gmxr\") pod \"keda-admission-cf49989db-fkfjk\" (UID: \"e8f5c55c-91d9-4eb8-941b-d848cb7a492c\") " pod="openshift-keda/keda-admission-cf49989db-fkfjk" Apr 22 18:42:59.539170 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:59.537846 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e8f5c55c-91d9-4eb8-941b-d848cb7a492c-certificates\") pod \"keda-admission-cf49989db-fkfjk\" (UID: \"e8f5c55c-91d9-4eb8-941b-d848cb7a492c\") " pod="openshift-keda/keda-admission-cf49989db-fkfjk" Apr 22 18:42:59.543941 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:59.543915 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gmxr\" (UniqueName: \"kubernetes.io/projected/e8f5c55c-91d9-4eb8-941b-d848cb7a492c-kube-api-access-7gmxr\") pod \"keda-admission-cf49989db-fkfjk\" (UID: \"e8f5c55c-91d9-4eb8-941b-d848cb7a492c\") " pod="openshift-keda/keda-admission-cf49989db-fkfjk" Apr 22 18:42:59.617046 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:59.617011 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-fkfjk" Apr 22 18:42:59.760451 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:42:59.760430 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-fkfjk"] Apr 22 18:42:59.762462 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:42:59.762433 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8f5c55c_91d9_4eb8_941b_d848cb7a492c.slice/crio-955ef04b3a998cfc81e467ec287f43ad0bda37902bd5fa0283ab516f40513f0c WatchSource:0}: Error finding container 955ef04b3a998cfc81e467ec287f43ad0bda37902bd5fa0283ab516f40513f0c: Status 404 returned error can't find the container with id 955ef04b3a998cfc81e467ec287f43ad0bda37902bd5fa0283ab516f40513f0c Apr 22 18:43:00.023271 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:43:00.023185 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-fkfjk" 
event={"ID":"e8f5c55c-91d9-4eb8-941b-d848cb7a492c","Type":"ContainerStarted","Data":"955ef04b3a998cfc81e467ec287f43ad0bda37902bd5fa0283ab516f40513f0c"} Apr 22 18:43:00.442483 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:43:00.442450 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-certificates\") pod \"keda-operator-ffbb595cb-lc4lk\" (UID: \"cee90001-a1f0-47a5-942e-b79f2c0c6ab3\") " pod="openshift-keda/keda-operator-ffbb595cb-lc4lk" Apr 22 18:43:00.442686 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:43:00.442623 2572 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:43:00.442686 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:43:00.442646 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:43:00.442686 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:43:00.442658 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-lc4lk: references non-existent secret key: ca.crt Apr 22 18:43:00.442853 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:43:00.442724 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-certificates podName:cee90001-a1f0-47a5-942e-b79f2c0c6ab3 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:02.442705323 +0000 UTC m=+417.301874028 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-certificates") pod "keda-operator-ffbb595cb-lc4lk" (UID: "cee90001-a1f0-47a5-942e-b79f2c0c6ab3") : references non-existent secret key: ca.crt Apr 22 18:43:02.459716 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:43:02.459619 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-certificates\") pod \"keda-operator-ffbb595cb-lc4lk\" (UID: \"cee90001-a1f0-47a5-942e-b79f2c0c6ab3\") " pod="openshift-keda/keda-operator-ffbb595cb-lc4lk" Apr 22 18:43:02.460099 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:43:02.459786 2572 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:43:02.460099 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:43:02.459805 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:43:02.460099 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:43:02.459814 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-lc4lk: references non-existent secret key: ca.crt Apr 22 18:43:02.460099 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:43:02.459866 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-certificates podName:cee90001-a1f0-47a5-942e-b79f2c0c6ab3 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:06.459852718 +0000 UTC m=+421.319021416 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-certificates") pod "keda-operator-ffbb595cb-lc4lk" (UID: "cee90001-a1f0-47a5-942e-b79f2c0c6ab3") : references non-existent secret key: ca.crt Apr 22 18:43:03.034605 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:43:03.034567 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-fkfjk" event={"ID":"e8f5c55c-91d9-4eb8-941b-d848cb7a492c","Type":"ContainerStarted","Data":"44c4c4e5293e2ef8a377d2bc81a74c969afd6b6af3d9bbdadfa9bbd175c13baa"} Apr 22 18:43:03.034771 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:43:03.034703 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-fkfjk" Apr 22 18:43:03.065419 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:43:03.065369 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-fkfjk" podStartSLOduration=1.7341104189999998 podStartE2EDuration="4.065355997s" podCreationTimestamp="2026-04-22 18:42:59 +0000 UTC" firstStartedPulling="2026-04-22 18:42:59.763859649 +0000 UTC m=+414.623028346" lastFinishedPulling="2026-04-22 18:43:02.095105221 +0000 UTC m=+416.954273924" observedRunningTime="2026-04-22 18:43:03.062940971 +0000 UTC m=+417.922109690" watchObservedRunningTime="2026-04-22 18:43:03.065355997 +0000 UTC m=+417.924524742" Apr 22 18:43:06.492407 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:43:06.492368 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-certificates\") pod \"keda-operator-ffbb595cb-lc4lk\" (UID: \"cee90001-a1f0-47a5-942e-b79f2c0c6ab3\") " pod="openshift-keda/keda-operator-ffbb595cb-lc4lk" Apr 22 18:43:06.494721 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:43:06.494693 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cee90001-a1f0-47a5-942e-b79f2c0c6ab3-certificates\") pod \"keda-operator-ffbb595cb-lc4lk\" (UID: \"cee90001-a1f0-47a5-942e-b79f2c0c6ab3\") " pod="openshift-keda/keda-operator-ffbb595cb-lc4lk" Apr 22 18:43:06.768377 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:43:06.768268 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-lc4lk" Apr 22 18:43:06.891938 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:43:06.891911 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lc4lk"] Apr 22 18:43:07.053412 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:43:07.053314 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-lc4lk" event={"ID":"cee90001-a1f0-47a5-942e-b79f2c0c6ab3","Type":"ContainerStarted","Data":"8189e5bea78332bec3a633ac310e90a4bced82a903710506d9004c926fef703e"} Apr 22 18:43:13.073513 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:43:13.073474 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-lc4lk" event={"ID":"cee90001-a1f0-47a5-942e-b79f2c0c6ab3","Type":"ContainerStarted","Data":"a3b479fb526614dbf66685f779325a77c370ca65998bd10d0c14427ae9013cc0"} Apr 22 18:43:13.073875 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:43:13.073595 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-lc4lk" Apr 22 18:43:13.091686 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:43:13.091637 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-lc4lk" podStartSLOduration=9.13640943 podStartE2EDuration="15.091619131s" podCreationTimestamp="2026-04-22 18:42:58 +0000 UTC" firstStartedPulling="2026-04-22 18:43:06.896838754 +0000 UTC m=+421.756007464" 
lastFinishedPulling="2026-04-22 18:43:12.852048466 +0000 UTC m=+427.711217165" observedRunningTime="2026-04-22 18:43:13.090784619 +0000 UTC m=+427.949953341" watchObservedRunningTime="2026-04-22 18:43:13.091619131 +0000 UTC m=+427.950787851" Apr 22 18:43:20.025847 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:43:20.025804 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6qpbs" Apr 22 18:43:24.040829 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:43:24.040798 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-fkfjk" Apr 22 18:43:34.078724 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:43:34.078690 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-lc4lk" Apr 22 18:44:04.219809 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:04.219774 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-7srt6"] Apr 22 18:44:04.222289 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:04.222270 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-7srt6" Apr 22 18:44:04.224783 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:04.224757 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 18:44:04.224900 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:04.224756 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 22 18:44:04.224900 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:04.224856 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 18:44:04.225772 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:04.225752 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-lq2l9\"" Apr 22 18:44:04.231710 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:04.231688 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-7srt6"] Apr 22 18:44:04.343772 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:04.343737 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qccfv\" (UniqueName: \"kubernetes.io/projected/d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a-kube-api-access-qccfv\") pod \"llmisvc-controller-manager-68cc5db7c4-7srt6\" (UID: \"d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-7srt6" Apr 22 18:44:04.343772 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:04.343783 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-7srt6\" (UID: \"d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-7srt6" Apr 22 
18:44:04.444393 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:04.444358 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qccfv\" (UniqueName: \"kubernetes.io/projected/d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a-kube-api-access-qccfv\") pod \"llmisvc-controller-manager-68cc5db7c4-7srt6\" (UID: \"d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-7srt6"
Apr 22 18:44:04.444393 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:04.444395 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-7srt6\" (UID: \"d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-7srt6"
Apr 22 18:44:04.444596 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:44:04.444495 2572 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found
Apr 22 18:44:04.444596 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:44:04.444557 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a-cert podName:d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a nodeName:}" failed. No retries permitted until 2026-04-22 18:44:04.944542376 +0000 UTC m=+479.803711074 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a-cert") pod "llmisvc-controller-manager-68cc5db7c4-7srt6" (UID: "d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a") : secret "llmisvc-webhook-server-cert" not found
Apr 22 18:44:04.454957 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:04.454927 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qccfv\" (UniqueName: \"kubernetes.io/projected/d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a-kube-api-access-qccfv\") pod \"llmisvc-controller-manager-68cc5db7c4-7srt6\" (UID: \"d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-7srt6"
Apr 22 18:44:04.948030 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:04.947996 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-7srt6\" (UID: \"d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-7srt6"
Apr 22 18:44:04.950359 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:04.950317 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-7srt6\" (UID: \"d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-7srt6"
Apr 22 18:44:05.133052 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:05.133013 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-7srt6"
Apr 22 18:44:05.254502 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:05.254469 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-7srt6"]
Apr 22 18:44:05.257635 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:44:05.257603 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd7774e6e_f64a_4dae_aca0_ca8aeafa8d9a.slice/crio-c98e51abfeb51222f3827a0092d35fed254bf78a9180b236f4eb2ab76050764b WatchSource:0}: Error finding container c98e51abfeb51222f3827a0092d35fed254bf78a9180b236f4eb2ab76050764b: Status 404 returned error can't find the container with id c98e51abfeb51222f3827a0092d35fed254bf78a9180b236f4eb2ab76050764b
Apr 22 18:44:06.242758 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:06.242721 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-7srt6" event={"ID":"d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a","Type":"ContainerStarted","Data":"c98e51abfeb51222f3827a0092d35fed254bf78a9180b236f4eb2ab76050764b"}
Apr 22 18:44:08.251747 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:08.251709 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-7srt6" event={"ID":"d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a","Type":"ContainerStarted","Data":"21b70b1e316ee3a551f27614f62534def8cdbb2c6d2b2d2673e68c098e6521c1"}
Apr 22 18:44:08.252131 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:08.251844 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-7srt6"
Apr 22 18:44:08.282080 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:08.282025 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-7srt6" podStartSLOduration=2.409193921 podStartE2EDuration="4.282008902s" podCreationTimestamp="2026-04-22 18:44:04 +0000 UTC" firstStartedPulling="2026-04-22 18:44:05.258903915 +0000 UTC m=+480.118072613" lastFinishedPulling="2026-04-22 18:44:07.131718892 +0000 UTC m=+481.990887594" observedRunningTime="2026-04-22 18:44:08.280764508 +0000 UTC m=+483.139933228" watchObservedRunningTime="2026-04-22 18:44:08.282008902 +0000 UTC m=+483.141177623"
Apr 22 18:44:39.258438 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:44:39.258397 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-7srt6"
Apr 22 18:45:13.827813 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:13.827772 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-xthpz"]
Apr 22 18:45:13.830292 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:13.830270 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-xthpz"
Apr 22 18:45:13.832966 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:13.832940 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-l7xn4\""
Apr 22 18:45:13.833125 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:13.833102 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 22 18:45:13.844490 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:13.844467 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-xthpz"]
Apr 22 18:45:13.989015 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:13.988978 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/79c37010-87f8-4127-9793-8e12c429dc67-tls-certs\") pod \"model-serving-api-86f7b4b499-xthpz\" (UID: \"79c37010-87f8-4127-9793-8e12c429dc67\") " pod="kserve/model-serving-api-86f7b4b499-xthpz"
Apr 22 18:45:13.989208 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:13.989021 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7fm5\" (UniqueName: \"kubernetes.io/projected/79c37010-87f8-4127-9793-8e12c429dc67-kube-api-access-m7fm5\") pod \"model-serving-api-86f7b4b499-xthpz\" (UID: \"79c37010-87f8-4127-9793-8e12c429dc67\") " pod="kserve/model-serving-api-86f7b4b499-xthpz"
Apr 22 18:45:14.089617 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:14.089586 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/79c37010-87f8-4127-9793-8e12c429dc67-tls-certs\") pod \"model-serving-api-86f7b4b499-xthpz\" (UID: \"79c37010-87f8-4127-9793-8e12c429dc67\") " pod="kserve/model-serving-api-86f7b4b499-xthpz"
Apr 22 18:45:14.089753 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:14.089643 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7fm5\" (UniqueName: \"kubernetes.io/projected/79c37010-87f8-4127-9793-8e12c429dc67-kube-api-access-m7fm5\") pod \"model-serving-api-86f7b4b499-xthpz\" (UID: \"79c37010-87f8-4127-9793-8e12c429dc67\") " pod="kserve/model-serving-api-86f7b4b499-xthpz"
Apr 22 18:45:14.092031 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:14.091998 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/79c37010-87f8-4127-9793-8e12c429dc67-tls-certs\") pod \"model-serving-api-86f7b4b499-xthpz\" (UID: \"79c37010-87f8-4127-9793-8e12c429dc67\") " pod="kserve/model-serving-api-86f7b4b499-xthpz"
Apr 22 18:45:14.098639 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:14.098619 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7fm5\" (UniqueName: \"kubernetes.io/projected/79c37010-87f8-4127-9793-8e12c429dc67-kube-api-access-m7fm5\") pod \"model-serving-api-86f7b4b499-xthpz\" (UID: \"79c37010-87f8-4127-9793-8e12c429dc67\") " pod="kserve/model-serving-api-86f7b4b499-xthpz"
Apr 22 18:45:14.143556 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:14.143523 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-xthpz"
Apr 22 18:45:14.262824 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:14.262776 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-xthpz"]
Apr 22 18:45:14.265191 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:45:14.265161 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79c37010_87f8_4127_9793_8e12c429dc67.slice/crio-277c4a0e2454db059f82c09b83421838389e0fd884a26048a0fdb26a5a9d9069 WatchSource:0}: Error finding container 277c4a0e2454db059f82c09b83421838389e0fd884a26048a0fdb26a5a9d9069: Status 404 returned error can't find the container with id 277c4a0e2454db059f82c09b83421838389e0fd884a26048a0fdb26a5a9d9069
Apr 22 18:45:14.460243 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:14.460153 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-xthpz" event={"ID":"79c37010-87f8-4127-9793-8e12c429dc67","Type":"ContainerStarted","Data":"277c4a0e2454db059f82c09b83421838389e0fd884a26048a0fdb26a5a9d9069"}
Apr 22 18:45:17.474392 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:17.474353 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-xthpz" event={"ID":"79c37010-87f8-4127-9793-8e12c429dc67","Type":"ContainerStarted","Data":"4bd95e33a8ebb345145bdd654f6c09c0e8270fe8b0a797234fd64fb3fd2f7434"}
Apr 22 18:45:17.474815 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:17.474466 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-xthpz"
Apr 22 18:45:17.491827 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:17.491778 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-xthpz" podStartSLOduration=2.340369177 podStartE2EDuration="4.49176508s" podCreationTimestamp="2026-04-22 18:45:13 +0000 UTC" firstStartedPulling="2026-04-22 18:45:14.267000971 +0000 UTC m=+549.126169669" lastFinishedPulling="2026-04-22 18:45:16.418396873 +0000 UTC m=+551.277565572" observedRunningTime="2026-04-22 18:45:17.490418316 +0000 UTC m=+552.349587037" watchObservedRunningTime="2026-04-22 18:45:17.49176508 +0000 UTC m=+552.350933800"
Apr 22 18:45:18.978730 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:18.978697 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-656996d56b-rld47"]
Apr 22 18:45:18.981425 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:18.981399 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:18.993948 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:18.993919 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-656996d56b-rld47"]
Apr 22 18:45:19.134121 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.134088 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/863732cc-12b6-449b-b6b2-6e9f2a06d47f-trusted-ca-bundle\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.134121 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.134122 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/863732cc-12b6-449b-b6b2-6e9f2a06d47f-service-ca\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.134325 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.134145 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/863732cc-12b6-449b-b6b2-6e9f2a06d47f-console-serving-cert\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.134325 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.134168 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/863732cc-12b6-449b-b6b2-6e9f2a06d47f-oauth-serving-cert\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.134325 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.134226 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2gmg\" (UniqueName: \"kubernetes.io/projected/863732cc-12b6-449b-b6b2-6e9f2a06d47f-kube-api-access-s2gmg\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.134325 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.134293 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/863732cc-12b6-449b-b6b2-6e9f2a06d47f-console-config\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.134486 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.134384 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/863732cc-12b6-449b-b6b2-6e9f2a06d47f-console-oauth-config\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.234977 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.234906 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/863732cc-12b6-449b-b6b2-6e9f2a06d47f-console-oauth-config\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.234977 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.234945 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/863732cc-12b6-449b-b6b2-6e9f2a06d47f-trusted-ca-bundle\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.234977 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.234963 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/863732cc-12b6-449b-b6b2-6e9f2a06d47f-service-ca\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.235255 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.234990 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/863732cc-12b6-449b-b6b2-6e9f2a06d47f-console-serving-cert\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.235255 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.235134 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/863732cc-12b6-449b-b6b2-6e9f2a06d47f-oauth-serving-cert\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.235255 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.235174 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2gmg\" (UniqueName: \"kubernetes.io/projected/863732cc-12b6-449b-b6b2-6e9f2a06d47f-kube-api-access-s2gmg\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.235255 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.235218 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/863732cc-12b6-449b-b6b2-6e9f2a06d47f-console-config\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.235837 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.235814 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/863732cc-12b6-449b-b6b2-6e9f2a06d47f-service-ca\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.235965 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.235940 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/863732cc-12b6-449b-b6b2-6e9f2a06d47f-oauth-serving-cert\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.236005 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.235965 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/863732cc-12b6-449b-b6b2-6e9f2a06d47f-trusted-ca-bundle\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.236039 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.236015 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/863732cc-12b6-449b-b6b2-6e9f2a06d47f-console-config\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.237588 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.237560 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/863732cc-12b6-449b-b6b2-6e9f2a06d47f-console-oauth-config\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.237687 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.237568 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/863732cc-12b6-449b-b6b2-6e9f2a06d47f-console-serving-cert\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.244133 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.244109 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2gmg\" (UniqueName: \"kubernetes.io/projected/863732cc-12b6-449b-b6b2-6e9f2a06d47f-kube-api-access-s2gmg\") pod \"console-656996d56b-rld47\" (UID: \"863732cc-12b6-449b-b6b2-6e9f2a06d47f\") " pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.291788 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.291764 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:19.414845 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.414822 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-656996d56b-rld47"]
Apr 22 18:45:19.416902 ip-10-0-131-5 kubenswrapper[2572]: W0422 18:45:19.416872 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod863732cc_12b6_449b_b6b2_6e9f2a06d47f.slice/crio-7a5fb5816257efe108437b68326460fafb9addbc10dad96d4f8103d028ecee8e WatchSource:0}: Error finding container 7a5fb5816257efe108437b68326460fafb9addbc10dad96d4f8103d028ecee8e: Status 404 returned error can't find the container with id 7a5fb5816257efe108437b68326460fafb9addbc10dad96d4f8103d028ecee8e
Apr 22 18:45:19.483977 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.483946 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-656996d56b-rld47" event={"ID":"863732cc-12b6-449b-b6b2-6e9f2a06d47f","Type":"ContainerStarted","Data":"e0c7d8730d5aee2d4fec6fbfa17a904e7ee8ae32c7f866f1580df71538f00ebc"}
Apr 22 18:45:19.484107 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.483984 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-656996d56b-rld47" event={"ID":"863732cc-12b6-449b-b6b2-6e9f2a06d47f","Type":"ContainerStarted","Data":"7a5fb5816257efe108437b68326460fafb9addbc10dad96d4f8103d028ecee8e"}
Apr 22 18:45:19.502861 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:19.502776 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-656996d56b-rld47" podStartSLOduration=1.502762497 podStartE2EDuration="1.502762497s" podCreationTimestamp="2026-04-22 18:45:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:45:19.501719759 +0000 UTC m=+554.360888484" watchObservedRunningTime="2026-04-22 18:45:19.502762497 +0000 UTC m=+554.361931216"
Apr 22 18:45:28.482392 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:28.482359 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-xthpz"
Apr 22 18:45:29.292402 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:29.292350 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:29.292402 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:29.292407 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:29.297193 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:29.297168 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:29.520514 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:29.520484 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-656996d56b-rld47"
Apr 22 18:45:29.578789 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:29.578703 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cff5ccc4-mnt5j"]
Apr 22 18:45:54.601107 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:54.601068 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-cff5ccc4-mnt5j" podUID="08ddac11-7a61-46c9-bd54-bbc43caba02f" containerName="console" containerID="cri-o://375d1d2300d45358b728921675a4727bb6ef8b6c9c4be35134762dbbc5ca32f6" gracePeriod=15
Apr 22 18:45:54.845913 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:54.845889 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cff5ccc4-mnt5j_08ddac11-7a61-46c9-bd54-bbc43caba02f/console/0.log"
Apr 22 18:45:54.846004 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:54.845954 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cff5ccc4-mnt5j"
Apr 22 18:45:54.907652 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:54.907590 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/08ddac11-7a61-46c9-bd54-bbc43caba02f-console-serving-cert\") pod \"08ddac11-7a61-46c9-bd54-bbc43caba02f\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") "
Apr 22 18:45:54.907652 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:54.907633 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pdbd\" (UniqueName: \"kubernetes.io/projected/08ddac11-7a61-46c9-bd54-bbc43caba02f-kube-api-access-7pdbd\") pod \"08ddac11-7a61-46c9-bd54-bbc43caba02f\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") "
Apr 22 18:45:54.907826 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:54.907672 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-oauth-serving-cert\") pod \"08ddac11-7a61-46c9-bd54-bbc43caba02f\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") "
Apr 22 18:45:54.907826 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:54.907713 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-trusted-ca-bundle\") pod \"08ddac11-7a61-46c9-bd54-bbc43caba02f\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") "
Apr 22 18:45:54.907826 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:54.907751 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-console-config\") pod \"08ddac11-7a61-46c9-bd54-bbc43caba02f\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") "
Apr 22 18:45:54.907826 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:54.907784 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/08ddac11-7a61-46c9-bd54-bbc43caba02f-console-oauth-config\") pod \"08ddac11-7a61-46c9-bd54-bbc43caba02f\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") "
Apr 22 18:45:54.907826 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:54.907814 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-service-ca\") pod \"08ddac11-7a61-46c9-bd54-bbc43caba02f\" (UID: \"08ddac11-7a61-46c9-bd54-bbc43caba02f\") "
Apr 22 18:45:54.908158 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:54.908132 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "08ddac11-7a61-46c9-bd54-bbc43caba02f" (UID: "08ddac11-7a61-46c9-bd54-bbc43caba02f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:45:54.908267 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:54.908155 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-console-config" (OuterVolumeSpecName: "console-config") pod "08ddac11-7a61-46c9-bd54-bbc43caba02f" (UID: "08ddac11-7a61-46c9-bd54-bbc43caba02f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:45:54.908317 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:54.908243 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "08ddac11-7a61-46c9-bd54-bbc43caba02f" (UID: "08ddac11-7a61-46c9-bd54-bbc43caba02f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:45:54.908390 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:54.908313 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-service-ca" (OuterVolumeSpecName: "service-ca") pod "08ddac11-7a61-46c9-bd54-bbc43caba02f" (UID: "08ddac11-7a61-46c9-bd54-bbc43caba02f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:45:54.909903 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:54.909879 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08ddac11-7a61-46c9-bd54-bbc43caba02f-kube-api-access-7pdbd" (OuterVolumeSpecName: "kube-api-access-7pdbd") pod "08ddac11-7a61-46c9-bd54-bbc43caba02f" (UID: "08ddac11-7a61-46c9-bd54-bbc43caba02f"). InnerVolumeSpecName "kube-api-access-7pdbd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:45:54.910235 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:54.910216 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ddac11-7a61-46c9-bd54-bbc43caba02f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "08ddac11-7a61-46c9-bd54-bbc43caba02f" (UID: "08ddac11-7a61-46c9-bd54-bbc43caba02f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:45:54.910358 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:54.910311 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ddac11-7a61-46c9-bd54-bbc43caba02f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "08ddac11-7a61-46c9-bd54-bbc43caba02f" (UID: "08ddac11-7a61-46c9-bd54-bbc43caba02f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:45:55.009236 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:55.009211 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-trusted-ca-bundle\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\""
Apr 22 18:45:55.009236 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:55.009233 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-console-config\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\""
Apr 22 18:45:55.009376 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:55.009243 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/08ddac11-7a61-46c9-bd54-bbc43caba02f-console-oauth-config\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\""
Apr 22 18:45:55.009376 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:55.009252 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-service-ca\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\""
Apr 22 18:45:55.009376 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:55.009260 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/08ddac11-7a61-46c9-bd54-bbc43caba02f-console-serving-cert\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\""
Apr 22 18:45:55.009376 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:55.009270 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7pdbd\" (UniqueName: \"kubernetes.io/projected/08ddac11-7a61-46c9-bd54-bbc43caba02f-kube-api-access-7pdbd\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\""
Apr 22 18:45:55.009376 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:55.009278 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/08ddac11-7a61-46c9-bd54-bbc43caba02f-oauth-serving-cert\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\""
Apr 22 18:45:55.600728 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:55.600704 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cff5ccc4-mnt5j_08ddac11-7a61-46c9-bd54-bbc43caba02f/console/0.log"
Apr 22 18:45:55.600885 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:55.600742 2572 generic.go:358] "Generic (PLEG): container finished" podID="08ddac11-7a61-46c9-bd54-bbc43caba02f" containerID="375d1d2300d45358b728921675a4727bb6ef8b6c9c4be35134762dbbc5ca32f6" exitCode=2
Apr 22 18:45:55.600885 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:55.600773 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cff5ccc4-mnt5j" event={"ID":"08ddac11-7a61-46c9-bd54-bbc43caba02f","Type":"ContainerDied","Data":"375d1d2300d45358b728921675a4727bb6ef8b6c9c4be35134762dbbc5ca32f6"}
Apr 22 18:45:55.600885 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:55.600804 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cff5ccc4-mnt5j"
Apr 22 18:45:55.600885 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:55.600815 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cff5ccc4-mnt5j" event={"ID":"08ddac11-7a61-46c9-bd54-bbc43caba02f","Type":"ContainerDied","Data":"50dd3fa7bc98ff65bac3935412ca6f3aa41b132a83f1a09fd95364515554be5e"}
Apr 22 18:45:55.600885 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:55.600830 2572 scope.go:117] "RemoveContainer" containerID="375d1d2300d45358b728921675a4727bb6ef8b6c9c4be35134762dbbc5ca32f6"
Apr 22 18:45:55.609597 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:55.609393 2572 scope.go:117] "RemoveContainer" containerID="375d1d2300d45358b728921675a4727bb6ef8b6c9c4be35134762dbbc5ca32f6"
Apr 22 18:45:55.609799 ip-10-0-131-5 kubenswrapper[2572]: E0422 18:45:55.609613 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"375d1d2300d45358b728921675a4727bb6ef8b6c9c4be35134762dbbc5ca32f6\": container with ID starting with 375d1d2300d45358b728921675a4727bb6ef8b6c9c4be35134762dbbc5ca32f6 not found: ID does not exist" containerID="375d1d2300d45358b728921675a4727bb6ef8b6c9c4be35134762dbbc5ca32f6"
Apr 22 18:45:55.609799 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:55.609638 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"375d1d2300d45358b728921675a4727bb6ef8b6c9c4be35134762dbbc5ca32f6"} err="failed to get container status \"375d1d2300d45358b728921675a4727bb6ef8b6c9c4be35134762dbbc5ca32f6\": rpc error: code = NotFound desc = could not find container \"375d1d2300d45358b728921675a4727bb6ef8b6c9c4be35134762dbbc5ca32f6\": container with ID starting with 375d1d2300d45358b728921675a4727bb6ef8b6c9c4be35134762dbbc5ca32f6 not found: ID does not exist"
Apr 22 18:45:55.621896 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:55.621874 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cff5ccc4-mnt5j"]
Apr 22 18:45:55.626303 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:55.626282 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-cff5ccc4-mnt5j"]
Apr 22 18:45:55.741420 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:45:55.741393 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08ddac11-7a61-46c9-bd54-bbc43caba02f" path="/var/lib/kubelet/pods/08ddac11-7a61-46c9-bd54-bbc43caba02f/volumes"
Apr 22 18:46:05.620240 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:46:05.620200 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log"
Apr 22 18:46:05.620761 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:46:05.620615 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log"
Apr 22 18:46:05.623791 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:46:05.623771 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log"
Apr 22 18:46:05.624643 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:46:05.624623 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log"
Apr 22 18:51:05.641808 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:51:05.641768 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log"
Apr 22 18:51:05.643093 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:51:05.643071 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log"
Apr 22 18:51:05.645660 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:51:05.645640 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log"
Apr 22 18:51:05.646603 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:51:05.646584 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log"
Apr 22 18:56:05.663666 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:56:05.663631 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log"
Apr 22 18:56:05.666148 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:56:05.665728 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log"
Apr 22 18:56:05.667541 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:56:05.667521 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log"
Apr 22 18:56:05.669599 ip-10-0-131-5 kubenswrapper[2572]: I0422 18:56:05.669574 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log"
Apr 22 19:01:05.685693 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:01:05.685581 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log"
Apr 22 19:01:05.689607 ip-10-0-131-5 kubenswrapper[2572]: I0422
19:01:05.688581 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 19:01:05.690091 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:01:05.690072 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 19:01:05.692910 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:01:05.692888 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 19:06:05.709914 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:06:05.709797 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 19:06:05.714944 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:06:05.713821 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 19:06:05.714944 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:06:05.714831 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 19:06:05.718241 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:06:05.718225 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 19:11:05.733842 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:11:05.733728 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 
19:11:05.738216 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:11:05.738196 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 19:11:05.738908 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:11:05.738890 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 19:11:05.742875 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:11:05.742860 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 19:16:05.755812 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:16:05.755709 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 19:16:05.760060 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:16:05.760040 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 19:16:05.761694 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:16:05.761670 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 19:16:05.765668 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:16:05.765645 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 19:21:05.777953 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:21:05.777840 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 19:21:05.782114 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:21:05.782095 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 19:21:05.784677 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:21:05.784655 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 19:21:05.788391 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:21:05.788375 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 19:26:05.799529 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:26:05.799415 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 19:26:05.803556 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:26:05.803477 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 19:26:05.808065 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:26:05.808046 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 19:26:05.812002 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:26:05.811983 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 19:31:05.821499 ip-10-0-131-5 kubenswrapper[2572]: I0422 
19:31:05.821393 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 19:31:05.826039 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:31:05.826019 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 19:31:05.834368 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:31:05.834347 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 19:31:05.838413 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:31:05.838396 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 19:36:05.844568 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:36:05.844529 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 19:36:05.849017 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:36:05.848993 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 19:36:05.856754 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:36:05.856734 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 19:36:05.860601 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:36:05.860581 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 
19:41:05.866095 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:41:05.865994 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 19:41:05.870355 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:41:05.870316 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 19:41:05.879239 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:41:05.879221 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 19:41:05.883355 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:41:05.883337 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 19:46:05.888087 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:05.887974 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 19:46:05.892341 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:05.892310 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 19:46:05.902524 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:05.902503 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 19:46:05.906744 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:05.906719 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log" Apr 22 19:46:26.255340 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:26.255240 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hj4cl_977893ce-dc9d-42ee-9339-051be82076b8/global-pull-secret-syncer/0.log" Apr 22 19:46:26.454170 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:26.454141 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-wlpch_915264b6-6df0-4100-9a03-985c5f546a4b/konnectivity-agent/0.log" Apr 22 19:46:26.474869 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:26.474831 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-5.ec2.internal_55d83b29c984d704f2c407ca2173be08/haproxy/0.log" Apr 22 19:46:30.439619 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:30.439587 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jt2mg_b5a914ec-c1b6-4b17-a510-d0ca4c4348f3/node-exporter/0.log" Apr 22 19:46:30.461898 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:30.461870 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jt2mg_b5a914ec-c1b6-4b17-a510-d0ca4c4348f3/kube-rbac-proxy/0.log" Apr 22 19:46:30.482539 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:30.482518 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jt2mg_b5a914ec-c1b6-4b17-a510-d0ca4c4348f3/init-textfile/0.log" Apr 22 19:46:30.837530 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:30.837443 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-62ghg_03fd38f5-5a27-45c2-8958-0dae07e467ee/prometheus-operator/0.log" Apr 22 19:46:30.855853 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:30.855818 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-62ghg_03fd38f5-5a27-45c2-8958-0dae07e467ee/kube-rbac-proxy/0.log" Apr 22 19:46:30.880531 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:30.880504 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-cmgvl_49a8da23-e075-4aff-a4a7-44232fb3d61f/prometheus-operator-admission-webhook/0.log" Apr 22 19:46:30.983591 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:30.983565 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fcff6d6bc-744gl_a7d8f602-2ff4-4ef2-9216-162f4e272e2f/thanos-query/0.log" Apr 22 19:46:31.002563 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:31.002530 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fcff6d6bc-744gl_a7d8f602-2ff4-4ef2-9216-162f4e272e2f/kube-rbac-proxy-web/0.log" Apr 22 19:46:31.021772 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:31.021749 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fcff6d6bc-744gl_a7d8f602-2ff4-4ef2-9216-162f4e272e2f/kube-rbac-proxy/0.log" Apr 22 19:46:31.042994 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:31.042968 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fcff6d6bc-744gl_a7d8f602-2ff4-4ef2-9216-162f4e272e2f/prom-label-proxy/0.log" Apr 22 19:46:31.068571 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:31.068544 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fcff6d6bc-744gl_a7d8f602-2ff4-4ef2-9216-162f4e272e2f/kube-rbac-proxy-rules/0.log" Apr 22 19:46:31.091257 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:31.091236 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fcff6d6bc-744gl_a7d8f602-2ff4-4ef2-9216-162f4e272e2f/kube-rbac-proxy-metrics/0.log" 
Apr 22 19:46:32.230649 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:32.230612 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-fmqhj_89e53d0a-d054-4c95-b501-048e1450ca72/networking-console-plugin/0.log" Apr 22 19:46:32.646205 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:32.646170 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/2.log" Apr 22 19:46:32.654821 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:32.654797 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6brbj_60e694f5-420a-4a93-b793-b951e02e4c81/console-operator/3.log" Apr 22 19:46:33.002016 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.001934 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-656996d56b-rld47_863732cc-12b6-449b-b6b2-6e9f2a06d47f/console/0.log" Apr 22 19:46:33.340226 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.340195 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx"] Apr 22 19:46:33.340611 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.340530 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08ddac11-7a61-46c9-bd54-bbc43caba02f" containerName="console" Apr 22 19:46:33.340611 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.340541 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ddac11-7a61-46c9-bd54-bbc43caba02f" containerName="console" Apr 22 19:46:33.340611 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.340610 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="08ddac11-7a61-46c9-bd54-bbc43caba02f" containerName="console" Apr 22 19:46:33.343611 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.343594 2572 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx" Apr 22 19:46:33.346252 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.346227 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5hs9k\"/\"openshift-service-ca.crt\"" Apr 22 19:46:33.346392 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.346232 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5hs9k\"/\"kube-root-ca.crt\"" Apr 22 19:46:33.347021 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.347006 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5hs9k\"/\"default-dockercfg-9t4h8\"" Apr 22 19:46:33.350074 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.350054 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx"] Apr 22 19:46:33.405994 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.405961 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4-sys\") pod \"perf-node-gather-daemonset-9vwgx\" (UID: \"fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx" Apr 22 19:46:33.406156 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.406001 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4-proc\") pod \"perf-node-gather-daemonset-9vwgx\" (UID: \"fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx" Apr 22 19:46:33.406156 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.406027 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4-podres\") pod \"perf-node-gather-daemonset-9vwgx\" (UID: \"fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx" Apr 22 19:46:33.406156 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.406047 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4-lib-modules\") pod \"perf-node-gather-daemonset-9vwgx\" (UID: \"fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx" Apr 22 19:46:33.406156 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.406104 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mw2d\" (UniqueName: \"kubernetes.io/projected/fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4-kube-api-access-9mw2d\") pod \"perf-node-gather-daemonset-9vwgx\" (UID: \"fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx" Apr 22 19:46:33.472522 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.472485 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-8z4qp_7e46d4ab-f18c-4fbb-b659-be241b0d7c69/volume-data-source-validator/0.log" Apr 22 19:46:33.506708 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.506679 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4-sys\") pod \"perf-node-gather-daemonset-9vwgx\" (UID: \"fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx" Apr 22 19:46:33.506855 ip-10-0-131-5 kubenswrapper[2572]: 
I0422 19:46:33.506737 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4-proc\") pod \"perf-node-gather-daemonset-9vwgx\" (UID: \"fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx" Apr 22 19:46:33.506855 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.506785 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4-podres\") pod \"perf-node-gather-daemonset-9vwgx\" (UID: \"fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx" Apr 22 19:46:33.506855 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.506803 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4-sys\") pod \"perf-node-gather-daemonset-9vwgx\" (UID: \"fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx" Apr 22 19:46:33.506855 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.506812 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4-proc\") pod \"perf-node-gather-daemonset-9vwgx\" (UID: \"fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx" Apr 22 19:46:33.506855 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.506827 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4-lib-modules\") pod \"perf-node-gather-daemonset-9vwgx\" (UID: \"fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4\") " 
pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx" Apr 22 19:46:33.507064 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.506866 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mw2d\" (UniqueName: \"kubernetes.io/projected/fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4-kube-api-access-9mw2d\") pod \"perf-node-gather-daemonset-9vwgx\" (UID: \"fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx" Apr 22 19:46:33.507064 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.506910 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4-lib-modules\") pod \"perf-node-gather-daemonset-9vwgx\" (UID: \"fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx" Apr 22 19:46:33.507064 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.506920 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4-podres\") pod \"perf-node-gather-daemonset-9vwgx\" (UID: \"fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx" Apr 22 19:46:33.514917 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.514886 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mw2d\" (UniqueName: \"kubernetes.io/projected/fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4-kube-api-access-9mw2d\") pod \"perf-node-gather-daemonset-9vwgx\" (UID: \"fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx" Apr 22 19:46:33.654169 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.654072 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx" Apr 22 19:46:33.775607 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.775577 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx"] Apr 22 19:46:33.778062 ip-10-0-131-5 kubenswrapper[2572]: W0422 19:46:33.778025 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfc7826a2_21b5_451c_bb5b_8e8c05b3e2a4.slice/crio-fbb1d27209e6c85a939ff5c38a90c23f6e0532c422b4cfafc1297ea62be7ca5b WatchSource:0}: Error finding container fbb1d27209e6c85a939ff5c38a90c23f6e0532c422b4cfafc1297ea62be7ca5b: Status 404 returned error can't find the container with id fbb1d27209e6c85a939ff5c38a90c23f6e0532c422b4cfafc1297ea62be7ca5b Apr 22 19:46:33.779684 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:33.779664 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:46:34.229515 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:34.229487 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ntjbb_57975d7c-6756-4dde-9d27-faa3e96cc6f5/dns/0.log" Apr 22 19:46:34.262436 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:34.262403 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ntjbb_57975d7c-6756-4dde-9d27-faa3e96cc6f5/kube-rbac-proxy/0.log" Apr 22 19:46:34.374272 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:34.374247 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hjctd_d1294e5e-31d1-48a2-8134-4d7b0f658d42/dns-node-resolver/0.log" Apr 22 19:46:34.377711 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:34.377687 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx" 
event={"ID":"fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4","Type":"ContainerStarted","Data":"c8351605f7b294d36bd86704651b9e80236614a4398cba1b522c5e18a7ccb9be"}
Apr 22 19:46:34.377831 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:34.377718 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx" event={"ID":"fc7826a2-21b5-451c-bb5b-8e8c05b3e2a4","Type":"ContainerStarted","Data":"fbb1d27209e6c85a939ff5c38a90c23f6e0532c422b4cfafc1297ea62be7ca5b"}
Apr 22 19:46:34.377831 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:34.377815 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx"
Apr 22 19:46:34.395233 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:34.395193 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx" podStartSLOduration=1.395179455 podStartE2EDuration="1.395179455s" podCreationTimestamp="2026-04-22 19:46:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:46:34.39436894 +0000 UTC m=+4229.253537654" watchObservedRunningTime="2026-04-22 19:46:34.395179455 +0000 UTC m=+4229.254348174"
Apr 22 19:46:34.786962 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:34.786928 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5b7b79bf9d-fsxj6_0f9f437f-bb3e-48a8-9703-4e84916595f7/registry/0.log"
Apr 22 19:46:34.857984 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:34.857945 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tpthn_d8425f70-4f14-4d86-b30e-3abe38269764/node-ca/0.log"
Apr 22 19:46:35.512786 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:35.512749 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-69c74fc656-2f57x_1567a865-78f8-433b-a4dd-e7478597180f/router/0.log"
Apr 22 19:46:35.841939 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:35.841913 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2phz6_f0ecf33d-061b-4ba1-9f1e-ec8f458b1027/serve-healthcheck-canary/0.log"
Apr 22 19:46:36.233411 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:36.233289 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-gs4mb_e949ed90-0ea2-43e9-8cbc-ae1bec9390c9/insights-operator/0.log"
Apr 22 19:46:36.237706 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:36.237680 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-gs4mb_e949ed90-0ea2-43e9-8cbc-ae1bec9390c9/insights-operator/1.log"
Apr 22 19:46:36.315762 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:36.315735 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bkxxm_38ed131f-ec9b-4691-91ca-1438655083f8/kube-rbac-proxy/0.log"
Apr 22 19:46:36.335414 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:36.335390 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bkxxm_38ed131f-ec9b-4691-91ca-1438655083f8/exporter/0.log"
Apr 22 19:46:36.357490 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:36.357465 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bkxxm_38ed131f-ec9b-4691-91ca-1438655083f8/extractor/0.log"
Apr 22 19:46:38.300698 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:38.300666 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-7srt6_d7774e6e-f64a-4dae-aca0-ca8aeafa8d9a/manager/0.log"
Apr 22 19:46:38.320787 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:38.320759 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-xthpz_79c37010-87f8-4127-9793-8e12c429dc67/server/0.log"
Apr 22 19:46:40.390916 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:40.390886 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-9vwgx"
Apr 22 19:46:42.838549 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:42.838445 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-k7ljr_89183697-99ab-489f-95ef-9654164feac8/kube-storage-version-migrator-operator/1.log"
Apr 22 19:46:42.840364 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:42.840308 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-k7ljr_89183697-99ab-489f-95ef-9654164feac8/kube-storage-version-migrator-operator/0.log"
Apr 22 19:46:44.024157 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:44.024131 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zw6cn_3a24b441-1f95-45b3-b520-483d996f771f/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:46:44.046754 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:44.046728 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zw6cn_3a24b441-1f95-45b3-b520-483d996f771f/egress-router-binary-copy/0.log"
Apr 22 19:46:44.066436 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:44.066412 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zw6cn_3a24b441-1f95-45b3-b520-483d996f771f/cni-plugins/0.log"
Apr 22 19:46:44.086755 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:44.086729 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zw6cn_3a24b441-1f95-45b3-b520-483d996f771f/bond-cni-plugin/0.log"
Apr 22 19:46:44.105942 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:44.105915 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zw6cn_3a24b441-1f95-45b3-b520-483d996f771f/routeoverride-cni/0.log"
Apr 22 19:46:44.125140 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:44.125109 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zw6cn_3a24b441-1f95-45b3-b520-483d996f771f/whereabouts-cni-bincopy/0.log"
Apr 22 19:46:44.147144 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:44.147118 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zw6cn_3a24b441-1f95-45b3-b520-483d996f771f/whereabouts-cni/0.log"
Apr 22 19:46:44.209803 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:44.209781 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jl675_9a7f054c-e2d0-4250-be22-6160ebb37eec/kube-multus/0.log"
Apr 22 19:46:44.298884 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:44.298808 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-k7crw_fff77f0b-c2fb-4acb-b894-ce916d7cf9d2/network-metrics-daemon/0.log"
Apr 22 19:46:44.316763 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:44.316739 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-k7crw_fff77f0b-c2fb-4acb-b894-ce916d7cf9d2/kube-rbac-proxy/0.log"
Apr 22 19:46:45.703773 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:45.703743 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-controller/0.log"
Apr 22 19:46:45.719248 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:45.719207 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/0.log"
Apr 22 19:46:45.755955 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:45.755924 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovn-acl-logging/1.log"
Apr 22 19:46:45.776537 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:45.776507 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/kube-rbac-proxy-node/0.log"
Apr 22 19:46:45.795992 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:45.795966 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 19:46:45.812063 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:45.812021 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/northd/0.log"
Apr 22 19:46:45.831054 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:45.831031 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/nbdb/0.log"
Apr 22 19:46:45.851218 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:45.851195 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/sbdb/0.log"
Apr 22 19:46:46.013508 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:46.013428 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vjgnk_62a2a2c8-4324-4276-a6c1-57c3f81c4b5c/ovnkube-controller/0.log"
Apr 22 19:46:46.978668 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:46.978629 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-5h8ps_516d0b19-b6db-46c2-9865-24e9c2e844fc/network-check-target-container/0.log"
Apr 22 19:46:47.975217 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:47.975182 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-5vfs6_06291473-0b0d-41e0-99f1-3d887d31c55e/iptables-alerter/0.log"
Apr 22 19:46:48.598191 ip-10-0-131-5 kubenswrapper[2572]: I0422 19:46:48.598159 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-p4kkw_83267a92-55fb-45ae-8856-cfb92fa1ca05/tuned/0.log"