Apr 22 17:33:33.398717 ip-10-0-131-22 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 17:33:33.910817 ip-10-0-131-22 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:33:33.910817 ip-10-0-131-22 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 17:33:33.910817 ip-10-0-131-22 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:33:33.910817 ip-10-0-131-22 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 17:33:33.910817 ip-10-0-131-22 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:33:33.913486 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.913374 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 17:33:33.920502 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920481 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:33:33.920502 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920499 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:33:33.920502 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920504 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:33:33.920600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920508 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:33:33.920600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920512 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:33:33.920600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920515 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:33:33.920600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920518 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:33:33.920600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920521 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:33:33.920600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920524 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:33:33.920600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920527 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:33:33.920600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920531 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
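The five deprecation warnings at the top all point at the same remedy: carry those values in the KubeletConfiguration file named by --config instead of on the command line. As a minimal, illustrative sketch only (not the config that actually manages this OpenShift node), the following Go program renders such a fragment from values that appear in the FLAG dump later in this log; it assumes the k8s.io/kubelet, k8s.io/apimachinery, and sigs.k8s.io/yaml modules are on the module path.

    // Sketch: render a KubeletConfiguration fragment that replaces the
    // deprecated --container-runtime-endpoint, --volume-plugin-dir, and
    // --system-reserved flags. Values are taken from the FLAG dump in this
    // log; the program itself is illustrative, not how this node is managed.
    package main

    import (
    	"fmt"

    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	kubeletv1beta1 "k8s.io/kubelet/config/v1beta1"
    	"sigs.k8s.io/yaml"
    )

    func main() {
    	cfg := kubeletv1beta1.KubeletConfiguration{
    		TypeMeta: metav1.TypeMeta{
    			APIVersion: "kubelet.config.k8s.io/v1beta1",
    			Kind:       "KubeletConfiguration",
    		},
    		// From FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
    		ContainerRuntimeEndpoint: "unix:///var/run/crio/crio.sock",
    		// From FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
    		VolumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec",
    		// From FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
    		SystemReserved: map[string]string{
    			"cpu":               "500m",
    			"memory":            "1Gi",
    			"ephemeral-storage": "1Gi",
    		},
    	}
    	out, err := yaml.Marshal(&cfg)
    	if err != nil {
    		panic(err)
    	}
    	fmt.Print(string(out)) // write this to the file named by --config
    }

Writing the output to the path given by --config (/etc/kubernetes/kubelet.conf in the FLAG dump below) would silence the corresponding deprecation warnings; on a managed OpenShift node that file is typically owned by the cluster's machine-config machinery, so treat this as a sketch of the mechanism rather than a change to apply by hand.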
Apr 22 17:33:33.920600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920535 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:33:33.920600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920538 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:33:33.920600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920541 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:33:33.920600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920544 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:33:33.920600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920547 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:33:33.920600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920550 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:33:33.920600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920554 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:33:33.920600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920557 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:33:33.920600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920560 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:33:33.920600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920563 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:33:33.920600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920566 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:33:33.921057 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920569 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:33:33.921057 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920572 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:33:33.921057 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920575 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:33:33.921057 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920577 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:33:33.921057 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920580 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:33:33.921057 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920583 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:33:33.921057 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920585 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:33:33.921057 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920587 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:33:33.921057 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920590 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:33:33.921057 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920593 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:33:33.921057 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920595 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:33:33.921057 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920598 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:33:33.921057 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920601 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:33:33.921057 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920604 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:33:33.921057 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920607 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:33:33.921057 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920610 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:33:33.921057 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920613 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:33:33.921057 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920616 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:33:33.921057 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920620 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:33:33.921057 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920622 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:33:33.921614 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920625 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:33:33.921614 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920627 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:33:33.921614 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920630 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:33:33.921614 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920632 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:33:33.921614 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920635 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:33:33.921614 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920637 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:33:33.921614 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920640 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:33:33.921614 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920643 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:33:33.921614 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920646 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:33:33.921614 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920648 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:33:33.921614 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920651 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:33:33.921614 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920653 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:33:33.921614 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920656 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:33:33.921614 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920659 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:33:33.921614 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920662 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:33:33.921614 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920665 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:33:33.921614 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920667 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:33:33.921614 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920670 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:33:33.921614 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920674 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:33:33.921614 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920677 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:33:33.922096 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920680 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:33:33.922096 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920684 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:33:33.922096 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920686 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:33:33.922096 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920689 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:33:33.922096 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920691 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:33:33.922096 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920694 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:33:33.922096 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920696 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:33:33.922096 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920699 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:33:33.922096 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920702 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:33:33.922096 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920706 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:33:33.922096 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920708 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:33:33.922096 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920711 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:33:33.922096 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920713 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:33:33.922096 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920716 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:33:33.922096 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920719 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:33:33.922096 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920721 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:33:33.922096 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920724 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:33:33.922096 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920727 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:33:33.922096 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920730 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:33:33.922096 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920733 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:33:33.922600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920735 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:33:33.922600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920738 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:33:33.922600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920741 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:33:33.922600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.920743 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:33:33.922600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921160 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:33:33.922600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921168 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:33:33.922600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921171 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:33:33.922600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921173 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:33:33.922600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921176 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:33:33.922600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921179 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:33:33.922600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921181 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:33:33.922600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921184 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:33:33.922600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921187 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:33:33.922600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921190 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:33:33.922600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921193 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:33:33.922600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921195 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:33:33.922600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921198 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:33:33.922600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921200 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:33:33.922600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921203 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:33:33.923050 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921206 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:33:33.923050 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921208 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:33:33.923050 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921210 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:33:33.923050 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921213 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:33:33.923050 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921215 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:33:33.923050 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921218 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:33:33.923050 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921220 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:33:33.923050 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921223 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:33:33.923050 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921226 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:33:33.923050 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921228 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:33:33.923050 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921231 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:33:33.923050 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921233 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:33:33.923050 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921236 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:33:33.923050 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921238 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:33:33.923050 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921241 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:33:33.923050 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921243 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:33:33.923050 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921246 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:33:33.923050 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921248 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:33:33.923050 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921252 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:33:33.923050 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921256 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:33:33.923556 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921260 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:33:33.923556 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921263 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:33:33.923556 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921265 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:33:33.923556 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921268 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:33:33.923556 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921271 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:33:33.923556 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921273 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:33:33.923556 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921276 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:33:33.923556 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921278 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:33:33.923556 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921281 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:33:33.923556 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921284 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:33:33.923556 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921287 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:33:33.923556 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921289 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:33:33.923556 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921292 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:33:33.923556 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921294 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:33:33.923556 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921297 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:33:33.923556 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921299 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:33:33.923556 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921301 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:33:33.923556 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921304 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:33:33.923556 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921307 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:33:33.923556 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921309 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:33:33.924080 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921312 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:33:33.924080 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921314 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:33:33.924080 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921317 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:33:33.924080 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921320 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:33:33.924080 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921322 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:33:33.924080 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921325 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:33:33.924080 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921328 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:33:33.924080 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921330 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:33:33.924080 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921333 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:33:33.924080 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921335 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:33:33.924080 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921338 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:33:33.924080 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921340 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:33:33.924080 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921344 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:33:33.924080 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921348 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:33:33.924080 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921351 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:33:33.924080 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921354 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:33:33.924080 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921356 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:33:33.924080 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921358 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:33:33.924080 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921361 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:33:33.924080 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921363 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921366 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921368 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921370 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921374 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921376 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921379 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921381 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921384 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921386 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921389 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.921391 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921493 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921511 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921519 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921523 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921529 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921533 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921537 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921542 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921545 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 17:33:33.924586 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921548 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921552 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921555 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921558 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921561 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921564 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921567 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921570 2572 flags.go:64] FLAG: --cloud-config=""
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921573 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921576 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921580 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921583 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921586 2572 flags.go:64] FLAG: --config-dir=""
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921589 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921592 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921597 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921601 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921604 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921607 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921610 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921613 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921616 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921619 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921622 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921627 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 17:33:33.925101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921630 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921633 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921636 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921639 2572 flags.go:64] FLAG: --enable-server="true"
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921642 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921652 2572 flags.go:64] FLAG: --event-burst="100"
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921656 2572 flags.go:64] FLAG: --event-qps="50"
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921659 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921662 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921665 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921669 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921672 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921675 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921679 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921682 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921685 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921687 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921690 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921694 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921696 2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921700 2572 flags.go:64] FLAG: --feature-gates=""
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921708 2572 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921711 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921714 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921718 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921721 2572 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 17:33:33.925729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921724 2572 flags.go:64] FLAG: --help="false"
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921727 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-131-22.ec2.internal"
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921730 2572 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921733 2572 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921736 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921739 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921743 2572 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921746 2572 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921749 2572 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921752 2572 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921755 2572 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921758 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921761 2572 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921764 2572 flags.go:64] FLAG: --kube-reserved=""
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921767 2572 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921770 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921773 2572 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921776 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921778 2572 flags.go:64] FLAG: --lock-file=""
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921782 2572 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921784 2572 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921788 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921793 2572 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 17:33:33.926352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921796 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921799 2572 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921802 2572 flags.go:64] FLAG: --logging-format="text"
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921804 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921808 2572 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921811 2572 flags.go:64] FLAG: --manifest-url=""
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921814 2572 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921818 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921822 2572 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921826 2572 flags.go:64] FLAG: --max-pods="110"
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921829 2572 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921832 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921835 2572 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921838 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921841 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921844 2572 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921847 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921855 2572 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921858 2572 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921861 2572 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921864 2572 flags.go:64] FLAG: --pod-cidr=""
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921867 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921874 2572 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921877 2572 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 17:33:33.926930 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921880 2572 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921883 2572 flags.go:64] FLAG: --port="10250"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921886 2572 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921889 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-051ec95ca6f153784"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921893 2572 flags.go:64] FLAG: --qos-reserved=""
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921896 2572 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921899 2572 flags.go:64] FLAG: --register-node="true"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921902 2572 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921905 2572 flags.go:64] FLAG: --register-with-taints=""
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921908 2572 flags.go:64] FLAG: --registry-burst="10"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921911 2572 flags.go:64] FLAG: --registry-qps="5"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921914 2572 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921917 2572 flags.go:64] FLAG: --reserved-memory=""
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921921 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921924 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921927 2572 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921929 2572 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921933 2572 flags.go:64] FLAG: --runonce="false"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921936 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921939 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921942 2572 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921945 2572 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921948 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921952 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921955 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921958 2572 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 17:33:33.927527 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921961 2572 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921964 2572 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921967 2572 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921970 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921972 2572 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921976 2572 flags.go:64] FLAG: --system-cgroups=""
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921979 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921984 2572 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921987 2572 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921990 2572 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921994 2572 flags.go:64] FLAG: --tls-min-version=""
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.921997 2572 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.922000 2572 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.922003 2572 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.922007 2572 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.922010 2572 flags.go:64] FLAG: --v="2"
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.922014 2572 flags.go:64] FLAG: --version="false"
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.922018 2572 flags.go:64] FLAG: --vmodule=""
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.922022 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.922026 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922120 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922124 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922127 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922130 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:33:33.928154 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922133 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:33:33.928746 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922136 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:33:33.928746 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922138 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:33:33.928746 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922141 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:33:33.928746 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922144 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:33:33.928746 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922147 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:33:33.928746 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922150 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:33:33.928746 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922158 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:33:33.928746 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922161 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:33:33.928746 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922163 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:33:33.928746 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922166 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:33:33.928746 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922169 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:33:33.928746 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922171 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:33:33.928746 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922174 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:33:33.928746 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922177 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:33:33.928746 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922179 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:33:33.928746 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922182 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:33:33.928746 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922185 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:33:33.928746 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922189 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:33:33.928746 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922191 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:33:33.928746 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922194 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:33:33.929283 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922196 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:33:33.929283 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922199 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:33:33.929283 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922202 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:33:33.929283 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922204 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:33:33.929283 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922207 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:33:33.929283 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922210 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:33:33.929283 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922212 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:33:33.929283 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922215 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:33:33.929283 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922217 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:33:33.929283 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922220 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:33:33.929283 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922223 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:33:33.929283 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922225 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:33:33.929283 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922228 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:33:33.929283 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922231 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:33:33.929283 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922234 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:33:33.929283 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922236 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:33:33.929283 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922239 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:33:33.929283 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922241 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:33:33.929283 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922245 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:33:33.929283 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922247 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:33:33.929818 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922250 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:33:33.929818 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922252 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:33:33.929818 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922255 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:33:33.929818 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922258 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:33:33.929818 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922260 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:33:33.929818 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922263 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:33:33.929818 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922266 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:33:33.929818 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922268 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:33:33.929818 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922271 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:33:33.929818 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922275 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:33:33.929818 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922278 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:33:33.929818 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922280 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:33:33.929818 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922283 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:33:33.929818 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922285 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:33:33.929818 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922288 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:33:33.929818 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922290 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:33:33.929818 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922293 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:33:33.929818 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922296 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:33:33.929818 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922298 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:33:33.929818 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922300 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:33:33.930404 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922303 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:33:33.930404 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922305 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:33:33.930404 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922308 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:33:33.930404 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922310 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:33:33.930404 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922313 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:33:33.930404 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922316 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:33:33.930404 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922319 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:33:33.930404 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922321 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:33:33.930404 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922323 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:33:33.930404 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922326 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:33:33.930404 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922330 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:33:33.930404 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922332 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:33:33.930404 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922334 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:33:33.930404 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922338 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:33:33.930404 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922342 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:33:33.930404 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922345 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:33:33.930404 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922347 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:33:33.930404 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922350 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:33:33.930404 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922353 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:33:33.930903 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922356 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:33:33.930903 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.922358 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:33:33.930903 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.923100 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:33:33.931235 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.931215 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 17:33:33.931273 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.931236 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 17:33:33.931306 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931285 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:33:33.931306 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931291 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:33:33.931306 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931294 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:33:33.931306 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931297 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:33:33.931306 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931301 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:33:33.931306 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931303 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:33:33.931306 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931306 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:33:33.931306 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931308 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:33:33.931527 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931311 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22
17:33:33.931527 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931314 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:33:33.931527 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931317 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:33:33.931527 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931320 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:33:33.931527 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931322 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:33:33.931527 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931326 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:33:33.931527 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931329 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:33:33.931527 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931331 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:33:33.931527 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931334 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:33:33.931527 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931336 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:33:33.931527 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931339 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:33:33.931527 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931342 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:33:33.931527 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931345 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:33:33.931527 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931347 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:33:33.931527 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931350 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:33:33.931527 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931353 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:33:33.931527 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931355 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:33:33.931527 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931358 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:33:33.931527 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931361 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:33:33.931527 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931363 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:33:33.932015 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931366 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:33:33.932015 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931369 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:33:33.932015 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931371 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:33:33.932015 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931374 2572 
feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:33:33.932015 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931376 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:33:33.932015 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931379 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:33:33.932015 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931383 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 17:33:33.932015 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931387 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:33:33.932015 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931391 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 17:33:33.932015 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931395 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:33:33.932015 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931398 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:33:33.932015 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931401 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:33:33.932015 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931404 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:33:33.932015 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931407 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:33:33.932015 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931410 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:33:33.932015 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931412 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:33:33.932015 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931415 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:33:33.932015 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931432 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:33:33.932470 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931436 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:33:33.932470 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931438 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:33:33.932470 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931441 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:33:33.932470 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931444 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:33:33.932470 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931447 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:33:33.932470 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931450 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:33:33.932470 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931453 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:33:33.932470 ip-10-0-131-22 
kubenswrapper[2572]: W0422 17:33:33.931455 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:33:33.932470 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931458 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:33:33.932470 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931461 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:33:33.932470 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931463 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:33:33.932470 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931465 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:33:33.932470 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931468 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:33:33.932470 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931471 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:33:33.932470 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931473 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:33:33.932470 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931476 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:33:33.932470 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931478 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:33:33.932470 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931481 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:33:33.932470 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931483 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:33:33.932470 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931486 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:33:33.932973 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931489 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:33:33.932973 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931491 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:33:33.932973 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931494 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:33:33.932973 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931496 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:33:33.932973 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931498 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:33:33.932973 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931501 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:33:33.932973 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931503 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:33:33.932973 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931507 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:33:33.932973 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931509 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:33:33.932973 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931512 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:33:33.932973 ip-10-0-131-22 
kubenswrapper[2572]: W0422 17:33:33.931515 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:33:33.932973 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931517 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:33:33.932973 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931520 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:33:33.932973 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931522 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:33:33.932973 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931525 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:33:33.932973 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931527 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:33:33.932973 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931530 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:33:33.932973 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931532 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:33:33.932973 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931535 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:33:33.932973 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931538 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:33:33.933523 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.931543 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:33:33.933523 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931653 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:33:33.933523 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931658 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:33:33.933523 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931660 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:33:33.933523 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931663 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:33:33.933523 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931665 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:33:33.933523 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931668 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:33:33.933523 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931671 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:33:33.933523 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931673 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:33:33.933523 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931675 2572 feature_gate.go:328] unrecognized feature gate: 
AlibabaPlatform Apr 22 17:33:33.933523 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931678 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:33:33.933523 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931682 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:33:33.933523 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931684 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:33:33.933523 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931687 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:33:33.933523 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931690 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:33:33.933523 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931693 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:33:33.933914 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931695 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:33:33.933914 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931698 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:33:33.933914 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931700 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:33:33.933914 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931703 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:33:33.933914 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931705 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:33:33.933914 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931708 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:33:33.933914 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931710 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:33:33.933914 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931713 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:33:33.933914 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931715 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:33:33.933914 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931718 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:33:33.933914 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931721 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:33:33.933914 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931724 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:33:33.933914 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931726 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:33:33.933914 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931729 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:33:33.933914 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931731 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:33:33.933914 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931733 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:33:33.933914 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931737 
2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:33:33.933914 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931741 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:33:33.933914 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931744 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:33:33.933914 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931748 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:33:33.934383 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931751 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:33:33.934383 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931753 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:33:33.934383 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931756 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:33:33.934383 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931758 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:33:33.934383 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931761 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:33:33.934383 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931763 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:33:33.934383 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931766 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:33:33.934383 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931769 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:33:33.934383 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931771 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:33:33.934383 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931774 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:33:33.934383 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931777 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:33:33.934383 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931779 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:33:33.934383 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931782 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:33:33.934383 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931784 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:33:33.934383 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931787 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:33:33.934383 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931789 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:33:33.934383 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931792 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:33:33.934383 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931794 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:33:33.934383 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931797 2572 feature_gate.go:328] unrecognized feature gate: 
ImageStreamImportMode Apr 22 17:33:33.934930 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931799 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:33:33.934930 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931801 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:33:33.934930 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931806 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 17:33:33.934930 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931809 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:33:33.934930 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931813 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:33:33.934930 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931816 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:33:33.934930 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931819 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:33:33.934930 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931822 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:33:33.934930 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931825 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:33:33.934930 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931827 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:33:33.934930 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931830 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:33:33.934930 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931833 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:33:33.934930 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931836 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:33:33.934930 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931839 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:33:33.934930 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931841 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:33:33.934930 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931844 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:33:33.934930 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931847 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:33:33.934930 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931850 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:33:33.934930 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931852 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:33:33.935385 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931855 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:33:33.935385 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931858 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:33:33.935385 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931861 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 
17:33:33.935385 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931864 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:33:33.935385 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931866 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:33:33.935385 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931869 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:33:33.935385 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931871 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:33:33.935385 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931873 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:33:33.935385 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931876 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:33:33.935385 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931880 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 17:33:33.935385 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931883 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:33:33.935385 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931886 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:33:33.935385 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:33.931888 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:33:33.935385 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.931893 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:33:33.935385 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.932642 2572 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 17:33:33.935777 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.935622 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 17:33:33.936775 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.936763 2572 server.go:1019] "Starting client certificate rotation" Apr 22 17:33:33.936876 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.936859 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 17:33:33.937683 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.937671 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 17:33:33.965165 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.965142 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 17:33:33.975211 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.975183 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 17:33:33.992328 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.992311 2572 log.go:25] "Validated CRI v1 
runtime API" Apr 22 17:33:33.999868 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:33.999846 2572 log.go:25] "Validated CRI v1 image API" Apr 22 17:33:34.001294 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.001276 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 17:33:34.002051 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.002035 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 17:33:34.006384 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.006360 2572 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 90eef8f7-09c8-4629-8a53-0afe88a03c6e:/dev/nvme0n1p4 d0a6ad4b-eeb5-456b-8034-57f5f4bed1e6:/dev/nvme0n1p3] Apr 22 17:33:34.006487 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.006381 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 17:33:34.013334 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.013208 2572 manager.go:217] Machine: {Timestamp:2026-04-22 17:33:34.010769612 +0000 UTC m=+0.478031306 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099085 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a4a90c03ff87837f823e3716c62b4 SystemUUID:ec2a4a90-c03f-f878-37f8-23e3716c62b4 BootID:d926b4f8-2ce7-4691-b155-dd70d05f6cb2 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ab:89:c2:f1:bd Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ab:89:c2:f1:bd Speed:0 Mtu:9001} {Name:ovs-system MacAddress:56:8e:49:c0:a9:54 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 
Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 17:33:34.013334 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.013326 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 22 17:33:34.013455 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.013436 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 17:33:34.014516 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.014488 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 17:33:34.014666 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.014518 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-22.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 17:33:34.014709 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.014675 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 17:33:34.014709 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.014685 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 17:33:34.014709 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.014698 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 17:33:34.015630 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.015618 2572 server.go:72] "Creating device plugin registration server" 
version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 17:33:34.017436 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.017411 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 22 17:33:34.017555 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.017545 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 17:33:34.019995 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.019983 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 22 17:33:34.020034 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.019999 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 17:33:34.020034 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.020011 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 17:33:34.020034 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.020021 2572 kubelet.go:397] "Adding apiserver pod source" Apr 22 17:33:34.020034 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.020030 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 17:33:34.021914 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.021899 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 17:33:34.021914 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.021917 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 17:33:34.026197 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.026030 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 17:33:34.028257 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.028235 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 17:33:34.030186 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.030172 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 17:33:34.030268 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.030191 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 17:33:34.030268 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.030197 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 17:33:34.030268 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.030203 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 17:33:34.030268 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.030210 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 17:33:34.030268 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.030219 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 17:33:34.030268 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.030228 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 17:33:34.030268 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.030234 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 17:33:34.030268 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.030242 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 17:33:34.030268 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.030248 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 
22 17:33:34.030268 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.030257 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 17:33:34.030268 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.030266 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 17:33:34.031331 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.031320 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 17:33:34.031331 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.031331 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 17:33:34.034864 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.034829 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 17:33:34.034971 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.034863 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-22.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:33:34.034971 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.034940 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-22.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 17:33:34.035434 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.035409 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 17:33:34.035485 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.035463 2572 server.go:1295] "Started kubelet" Apr 22 17:33:34.035584 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.035547 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 17:33:34.035681 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.035599 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 17:33:34.035681 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.035667 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 17:33:34.036192 ip-10-0-131-22 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 17:33:34.037449 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.037413 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 17:33:34.038178 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.038165 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 22 17:33:34.042849 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.042828 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 17:33:34.042849 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.042842 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 17:33:34.043582 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.043565 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 17:33:34.043582 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.043586 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 17:33:34.043717 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.043564 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 17:33:34.043717 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.043666 2572 factory.go:55] Registering systemd factory Apr 22 17:33:34.043803 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.043726 2572 factory.go:223] Registration of the systemd container factory successfully Apr 22 17:33:34.043861 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.043823 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 22 17:33:34.043861 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.043834 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 22 17:33:34.044027 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.044004 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-22.ec2.internal\" not found" Apr 22 17:33:34.044175 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.044162 2572 factory.go:153] Registering CRI-O factory Apr 22 17:33:34.044262 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.044180 2572 factory.go:223] Registration of the crio container factory successfully Apr 22 17:33:34.044262 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.044229 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 17:33:34.044262 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.044253 2572 factory.go:103] Registering Raw factory Apr 22 17:33:34.044408 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.044268 2572 manager.go:1196] Started watching for new ooms in manager Apr 22 17:33:34.045084 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.044651 2572 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 17:33:34.045084 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.044674 2572 manager.go:319] Starting recovery of all containers Apr 22 17:33:34.045453 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.044154 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-22.ec2.internal.18a8be3b195b856b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-22.ec2.internal,UID:ip-10-0-131-22.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-22.ec2.internal,},FirstTimestamp:2026-04-22 17:33:34.035436907 +0000 UTC m=+0.502698600,LastTimestamp:2026-04-22 17:33:34.035436907 +0000 UTC m=+0.502698600,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-22.ec2.internal,}" Apr 22 17:33:34.046612 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.046577 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5hw8k" Apr 22 17:33:34.052315 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.052285 2572 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-22.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 17:33:34.052481 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.052452 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 17:33:34.054055 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.054033 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5hw8k" Apr 22 17:33:34.057278 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.057257 2572 manager.go:324] Recovery completed Apr 22 17:33:34.061431 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.061404 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:33:34.064653 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.064635 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-22.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:33:34.064721 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.064666 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:33:34.064721 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.064676 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-22.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:33:34.065141 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.065129 2572 cpu_manager.go:222] "Starting CPU manager" 
policy="none" Apr 22 17:33:34.065194 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.065140 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 17:33:34.065194 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.065157 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 22 17:33:34.066612 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.066543 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-22.ec2.internal.18a8be3b1b1950ca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-22.ec2.internal,UID:ip-10-0-131-22.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-22.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-22.ec2.internal,},FirstTimestamp:2026-04-22 17:33:34.06465249 +0000 UTC m=+0.531914182,LastTimestamp:2026-04-22 17:33:34.06465249 +0000 UTC m=+0.531914182,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-22.ec2.internal,}" Apr 22 17:33:34.068908 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.068895 2572 policy_none.go:49] "None policy: Start" Apr 22 17:33:34.068963 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.068913 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 17:33:34.068963 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.068930 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 22 17:33:34.121995 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.121810 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 17:33:34.136003 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.123113 2572 manager.go:341] "Starting Device Plugin manager" Apr 22 17:33:34.136003 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.123142 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 17:33:34.136003 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.123152 2572 server.go:85] "Starting device plugin registration server" Apr 22 17:33:34.136003 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.123156 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 17:33:34.136003 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.123179 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 17:33:34.136003 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.123207 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 17:33:34.136003 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.123218 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 17:33:34.136003 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.123311 2572 kubelet.go:2475] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Apr 22 17:33:34.136003 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.123514 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 17:33:34.136003 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.123528 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 17:33:34.136003 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.123693 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 17:33:34.136003 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.123780 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 17:33:34.136003 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.123792 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 17:33:34.136003 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.124414 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 17:33:34.136003 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.124477 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-22.ec2.internal\" not found" Apr 22 17:33:34.136003 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.127248 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:33:34.224346 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.224241 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:33:34.224346 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.224240 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-131-22.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal"] Apr 22 17:33:34.224548 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.224372 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:33:34.225247 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.225229 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-22.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:33:34.225345 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.225254 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-22.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:33:34.225345 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.225266 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:33:34.225345 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.225275 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:33:34.225345 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.225280 2572 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-131-22.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:33:34.225345 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.225286 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-22.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:33:34.225345 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.225316 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-22.ec2.internal" Apr 22 17:33:34.227586 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.227570 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:33:34.228291 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.228270 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-22.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:33:34.228385 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.228297 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:33:34.228385 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.228311 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-22.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:33:34.228487 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.228432 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-22.ec2.internal" Apr 22 17:33:34.228487 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.228480 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:33:34.229138 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.229121 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-22.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:33:34.229208 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.229151 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:33:34.229208 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.229165 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-22.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:33:34.230372 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.230359 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal" Apr 22 17:33:34.230408 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.230404 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:33:34.231144 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.231127 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-22.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:33:34.231215 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.231149 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:33:34.231215 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.231170 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-22.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:33:34.232096 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.232081 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-22.ec2.internal" Apr 22 17:33:34.232146 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.232102 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-22.ec2.internal\": node \"ip-10-0-131-22.ec2.internal\" not found" Apr 22 17:33:34.248037 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.248012 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-22.ec2.internal\" not found" Apr 22 17:33:34.254186 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.254171 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-22.ec2.internal\" not found" node="ip-10-0-131-22.ec2.internal" Apr 22 17:33:34.258680 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.258662 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-22.ec2.internal\" not found" node="ip-10-0-131-22.ec2.internal" Apr 22 17:33:34.344749 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.344717 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/43fb11857c40bae062d44a60922715d6-config\") pod \"kube-apiserver-proxy-ip-10-0-131-22.ec2.internal\" (UID: \"43fb11857c40bae062d44a60922715d6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-22.ec2.internal" Apr 22 17:33:34.344859 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.344753 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bd1d71453b132f13ea17ce4ef551d94c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal\" (UID: \"bd1d71453b132f13ea17ce4ef551d94c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal" Apr 22 17:33:34.344859 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.344792 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd1d71453b132f13ea17ce4ef551d94c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal\" (UID: \"bd1d71453b132f13ea17ce4ef551d94c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal" Apr 22 17:33:34.348805 
ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.348785 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-22.ec2.internal\" not found" Apr 22 17:33:34.445198 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.445159 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/43fb11857c40bae062d44a60922715d6-config\") pod \"kube-apiserver-proxy-ip-10-0-131-22.ec2.internal\" (UID: \"43fb11857c40bae062d44a60922715d6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-22.ec2.internal" Apr 22 17:33:34.445279 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.445215 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bd1d71453b132f13ea17ce4ef551d94c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal\" (UID: \"bd1d71453b132f13ea17ce4ef551d94c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal" Apr 22 17:33:34.445279 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.445231 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/43fb11857c40bae062d44a60922715d6-config\") pod \"kube-apiserver-proxy-ip-10-0-131-22.ec2.internal\" (UID: \"43fb11857c40bae062d44a60922715d6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-22.ec2.internal" Apr 22 17:33:34.445279 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.445244 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd1d71453b132f13ea17ce4ef551d94c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal\" (UID: \"bd1d71453b132f13ea17ce4ef551d94c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal" Apr 22 17:33:34.445378 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.445286 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd1d71453b132f13ea17ce4ef551d94c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal\" (UID: \"bd1d71453b132f13ea17ce4ef551d94c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal" Apr 22 17:33:34.445378 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.445282 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bd1d71453b132f13ea17ce4ef551d94c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal\" (UID: \"bd1d71453b132f13ea17ce4ef551d94c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal" Apr 22 17:33:34.449260 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.449244 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-22.ec2.internal\" not found" Apr 22 17:33:34.550051 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.549988 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-22.ec2.internal\" not found" Apr 22 17:33:34.557198 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.557175 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-22.ec2.internal" Apr 22 17:33:34.562145 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.562127 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal" Apr 22 17:33:34.650860 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.650819 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-22.ec2.internal\" not found" Apr 22 17:33:34.751308 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.751275 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-22.ec2.internal\" not found" Apr 22 17:33:34.851905 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.851803 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-22.ec2.internal\" not found" Apr 22 17:33:34.936176 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.936142 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 17:33:34.936752 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:34.936293 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 17:33:34.953034 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:34.952982 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-22.ec2.internal\" not found" Apr 22 17:33:35.025745 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:35.025723 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:33:35.042970 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:35.042935 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 17:33:35.053150 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:35.053124 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-22.ec2.internal\" not found" Apr 22 17:33:35.054094 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:35.054074 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 17:33:35.055958 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:35.055920 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 17:28:34 +0000 UTC" deadline="2028-02-05 13:34:35.896707114 +0000 UTC" Apr 22 17:33:35.055958 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:35.055955 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15692h1m0.840754865s" Apr 22 17:33:35.079694 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:35.079657 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-khcrv" Apr 22 17:33:35.088010 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:35.087984 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-khcrv" Apr 
22 17:33:35.153593 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:35.153569 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-22.ec2.internal\" not found" Apr 22 17:33:35.162085 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:35.162049 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43fb11857c40bae062d44a60922715d6.slice/crio-0d71160549a89e6f2f9a67e7cbd4027119f4efa32600d1d679af583c89ecba51 WatchSource:0}: Error finding container 0d71160549a89e6f2f9a67e7cbd4027119f4efa32600d1d679af583c89ecba51: Status 404 returned error can't find the container with id 0d71160549a89e6f2f9a67e7cbd4027119f4efa32600d1d679af583c89ecba51 Apr 22 17:33:35.164027 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:35.163993 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd1d71453b132f13ea17ce4ef551d94c.slice/crio-1b21112b8ee31da50f581e4137112e7f80a7132b7fe522cab89d80269c7ce985 WatchSource:0}: Error finding container 1b21112b8ee31da50f581e4137112e7f80a7132b7fe522cab89d80269c7ce985: Status 404 returned error can't find the container with id 1b21112b8ee31da50f581e4137112e7f80a7132b7fe522cab89d80269c7ce985 Apr 22 17:33:35.167288 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:35.167272 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 17:33:35.254001 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:35.253964 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-22.ec2.internal\" not found" Apr 22 17:33:35.281505 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:35.281481 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:33:35.354900 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:35.354863 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-22.ec2.internal\" not found" Apr 22 17:33:35.416594 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:35.416515 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:33:35.443434 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:35.443396 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-22.ec2.internal" Apr 22 17:33:35.453217 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:35.453194 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 17:33:35.454287 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:35.454275 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal" Apr 22 17:33:35.471360 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:35.471337 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 17:33:36.021558 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.021478 2572 apiserver.go:52] "Watching apiserver" Apr 22 17:33:36.032670 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.032352 2572 reflector.go:430] "Caches populated" type="*v1.Pod" 
reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 17:33:36.034622 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.034599 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-mh726","openshift-network-operator/iptables-alerter-rhnfm","openshift-cluster-node-tuning-operator/tuned-rf2jd","openshift-dns/node-resolver-lxs9q","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal","openshift-multus/multus-snt7h","openshift-ovn-kubernetes/ovnkube-node-knlln","kube-system/konnectivity-agent-6756n","kube-system/kube-apiserver-proxy-ip-10-0-131-22.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc","openshift-image-registry/node-ca-5lt7k","openshift-multus/multus-additional-cni-plugins-rdl42","openshift-multus/network-metrics-daemon-s8svp"] Apr 22 17:33:36.037469 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.037449 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.039709 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.039681 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rhnfm" Apr 22 17:33:36.040947 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.040859 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 17:33:36.040947 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.040883 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 17:33:36.041169 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.041118 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 17:33:36.041619 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.041301 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 17:33:36.041619 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.041345 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 17:33:36.041619 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.041521 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 17:33:36.041619 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.041546 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-wmgt6\"" Apr 22 17:33:36.041981 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.041950 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:33:36.042061 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.042008 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.042230 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.042206 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-pfrnz\"" Apr 22 17:33:36.042312 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.042292 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 17:33:36.042466 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.042446 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 17:33:36.045986 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.045613 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lxs9q" Apr 22 17:33:36.045986 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.045807 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 17:33:36.046850 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.046300 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:33:36.048443 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.048043 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9x4kg\"" Apr 22 17:33:36.049074 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.048961 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 17:33:36.049074 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.049036 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-jcmgh\"" Apr 22 17:33:36.049246 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.049112 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.049924 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.049822 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 17:33:36.051831 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.051687 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 17:33:36.051920 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.051901 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 17:33:36.052114 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.052095 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 17:33:36.052355 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.052339 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 17:33:36.052651 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.052631 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9sk4w\"" Apr 22 17:33:36.053983 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.053959 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-run\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.054076 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.053997 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-multus-cni-dir\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.054076 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054023 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-hostroot\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.054076 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054045 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-log-socket\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.054076 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054068 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-cni-bin\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.054261 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054091 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-system-cni-dir\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.054261 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054114 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-host-var-lib-cni-multus\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.054261 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054137 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-host-var-lib-kubelet\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.054261 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054172 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2fec8229-262a-438e-a71a-26d8ef9fda02-ovn-node-metrics-cert\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.054261 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054195 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-kubernetes\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.054261 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054217 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-sysctl-conf\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.054261 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054231 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-6756n" Apr 22 17:33:36.054261 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054242 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-etc-kubernetes\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.054649 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054266 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-run-systemd\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.054649 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054291 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.054649 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054316 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-host-run-multus-certs\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.054649 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054339 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a585179c-cf1a-4d62-9c9c-9f26a01e3c39-iptables-alerter-script\") pod \"iptables-alerter-rhnfm\" (UID: \"a585179c-cf1a-4d62-9c9c-9f26a01e3c39\") " pod="openshift-network-operator/iptables-alerter-rhnfm" Apr 22 17:33:36.054649 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054362 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-sys\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.054649 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054412 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-var-lib-openvswitch\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.054649 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054454 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-node-log\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.054649 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054479 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nw78\" (UniqueName: \"kubernetes.io/projected/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-kube-api-access-9nw78\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.054649 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054503 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-kubelet\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.054649 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054526 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-run-ovn\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.054649 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054549 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-sysconfig\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.054649 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054572 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-host-var-lib-cni-bin\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.054649 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054595 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-systemd-units\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.054649 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054617 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-sysctl-d\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.054649 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054640 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-systemd\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.055097 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054666 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-run-ovn-kubernetes\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.055097 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054689 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2fec8229-262a-438e-a71a-26d8ef9fda02-ovnkube-script-lib\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.055097 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054712 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-tuned\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.055097 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054738 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fea00d3c-4a77-47e4-84b9-89677ae7426c-cni-binary-copy\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.055097 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054766 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-slash\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.055097 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054791 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-etc-openvswitch\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.055097 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054823 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a585179c-cf1a-4d62-9c9c-9f26a01e3c39-host-slash\") pod \"iptables-alerter-rhnfm\" (UID: \"a585179c-cf1a-4d62-9c9c-9f26a01e3c39\") " pod="openshift-network-operator/iptables-alerter-rhnfm" Apr 22 17:33:36.055097 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054850 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-425l4\" (UniqueName: \"kubernetes.io/projected/a585179c-cf1a-4d62-9c9c-9f26a01e3c39-kube-api-access-425l4\") pod \"iptables-alerter-rhnfm\" (UID: \"a585179c-cf1a-4d62-9c9c-9f26a01e3c39\") " pod="openshift-network-operator/iptables-alerter-rhnfm" Apr 22 17:33:36.055097 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054874 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-var-lib-kubelet\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " 
pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.055097 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054897 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-tmp\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.055097 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054921 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-run-openvswitch\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.055097 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054944 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d3515122-d7cf-41fe-855d-d19ccfe73070-hosts-file\") pod \"node-resolver-lxs9q\" (UID: \"d3515122-d7cf-41fe-855d-d19ccfe73070\") " pod="openshift-dns/node-resolver-lxs9q" Apr 22 17:33:36.055097 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054969 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn4cw\" (UniqueName: \"kubernetes.io/projected/d3515122-d7cf-41fe-855d-d19ccfe73070-kube-api-access-bn4cw\") pod \"node-resolver-lxs9q\" (UID: \"d3515122-d7cf-41fe-855d-d19ccfe73070\") " pod="openshift-dns/node-resolver-lxs9q" Apr 22 17:33:36.055097 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.054994 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-os-release\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.055097 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.055017 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-host-run-k8s-cni-cncf-io\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.055097 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.055048 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-host-run-netns\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.055097 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.055080 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-modprobe-d\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.055723 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.055104 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-lib-modules\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.055723 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.055129 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-host\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.055723 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.055155 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-multus-socket-dir-parent\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.055723 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.055181 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fea00d3c-4a77-47e4-84b9-89677ae7426c-multus-daemon-config\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.055723 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.055205 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2fec8229-262a-438e-a71a-26d8ef9fda02-ovnkube-config\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.055723 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.055229 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txkng\" (UniqueName: \"kubernetes.io/projected/2fec8229-262a-438e-a71a-26d8ef9fda02-kube-api-access-txkng\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.055723 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.055253 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d3515122-d7cf-41fe-855d-d19ccfe73070-tmp-dir\") pod \"node-resolver-lxs9q\" (UID: \"d3515122-d7cf-41fe-855d-d19ccfe73070\") " pod="openshift-dns/node-resolver-lxs9q" Apr 22 17:33:36.055723 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.055284 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-run-netns\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.055723 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.055310 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2fec8229-262a-438e-a71a-26d8ef9fda02-env-overrides\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 
17:33:36.055723 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.055336 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-multus-conf-dir\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.055723 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.055360 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmk97\" (UniqueName: \"kubernetes.io/projected/fea00d3c-4a77-47e4-84b9-89677ae7426c-kube-api-access-zmk97\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.055723 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.055385 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-cnibin\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.055723 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.055411 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-cni-netd\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.056472 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.056452 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 17:33:36.057368 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.056782 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:33:36.057368 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:36.056861 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mh726" podUID="7c3eadbd-c6af-4686-bed4-c3a47b257864" Apr 22 17:33:36.057368 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.056961 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.058135 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.058113 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 17:33:36.058373 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.058346 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-2jlgb\"" Apr 22 17:33:36.059015 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.058941 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 17:33:36.059509 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.059286 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 17:33:36.059509 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.059291 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-wlfzg\"" Apr 22 17:33:36.059509 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.059363 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 17:33:36.059509 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.059215 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5lt7k" Apr 22 17:33:36.061576 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.061559 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 17:33:36.061828 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.061812 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 17:33:36.062022 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.061997 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 17:33:36.062288 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.062271 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-kc7pm\"" Apr 22 17:33:36.064446 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.064098 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.064446 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.064115 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:36.064446 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:36.064204 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s8svp" podUID="9feb1c60-1e90-405e-9beb-753e0747aed0" Apr 22 17:33:36.069167 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.068747 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 17:33:36.069167 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.068946 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 17:33:36.069167 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.068967 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-b9z47\"" Apr 22 17:33:36.088660 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.088629 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:28:35 +0000 UTC" deadline="2027-12-30 08:23:22.916461912 +0000 UTC" Apr 22 17:33:36.088859 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.088840 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14798h49m46.827628888s" Apr 22 17:33:36.128670 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.128560 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal" event={"ID":"bd1d71453b132f13ea17ce4ef551d94c","Type":"ContainerStarted","Data":"1b21112b8ee31da50f581e4137112e7f80a7132b7fe522cab89d80269c7ce985"} Apr 22 17:33:36.129616 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.129588 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-22.ec2.internal" event={"ID":"43fb11857c40bae062d44a60922715d6","Type":"ContainerStarted","Data":"0d71160549a89e6f2f9a67e7cbd4027119f4efa32600d1d679af583c89ecba51"} Apr 22 17:33:36.144837 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.144804 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 17:33:36.156131 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.155967 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bn4cw\" (UniqueName: \"kubernetes.io/projected/d3515122-d7cf-41fe-855d-d19ccfe73070-kube-api-access-bn4cw\") pod \"node-resolver-lxs9q\" (UID: \"d3515122-d7cf-41fe-855d-d19ccfe73070\") " pod="openshift-dns/node-resolver-lxs9q" Apr 22 17:33:36.156131 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156011 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-os-release\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.156131 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156095 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-os-release\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.156369 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-host-run-k8s-cni-cncf-io\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.156451 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156366 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-host-run-netns\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.156451 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156395 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-modprobe-d\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.156451 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156438 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-lib-modules\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.156602 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156463 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-host\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.156602 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156488 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-multus-socket-dir-parent\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.156602 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156513 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fea00d3c-4a77-47e4-84b9-89677ae7426c-multus-daemon-config\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.156602 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156537 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2fec8229-262a-438e-a71a-26d8ef9fda02-ovnkube-config\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.156602 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156563 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-txkng\" (UniqueName: \"kubernetes.io/projected/2fec8229-262a-438e-a71a-26d8ef9fda02-kube-api-access-txkng\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.156841 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156621 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/d3515122-d7cf-41fe-855d-d19ccfe73070-tmp-dir\") pod \"node-resolver-lxs9q\" (UID: \"d3515122-d7cf-41fe-855d-d19ccfe73070\") " pod="openshift-dns/node-resolver-lxs9q" Apr 22 17:33:36.156841 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156646 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-run-netns\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.156841 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156670 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2fec8229-262a-438e-a71a-26d8ef9fda02-env-overrides\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.156841 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156700 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/000311a6-600b-4136-89c9-336cdc563106-cnibin\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.156841 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156729 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-multus-conf-dir\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.156841 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156754 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmk97\" (UniqueName: \"kubernetes.io/projected/fea00d3c-4a77-47e4-84b9-89677ae7426c-kube-api-access-zmk97\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.156841 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156782 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/000311a6-600b-4136-89c9-336cdc563106-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.156841 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156809 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/000311a6-600b-4136-89c9-336cdc563106-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.157348 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156843 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-cnibin\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.157348 ip-10-0-131-22 
kubenswrapper[2572]: I0422 17:33:36.156868 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-cni-netd\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.157348 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156894 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-run\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.157348 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156929 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3c9fd5d9-260c-45fa-9866-f2e61eedd051-sys-fs\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.157348 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156955 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-multus-cni-dir\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.157348 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.156980 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-hostroot\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.157348 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157013 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-log-socket\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.157348 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157039 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-cni-bin\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.157348 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157065 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3c9fd5d9-260c-45fa-9866-f2e61eedd051-device-dir\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.157348 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157097 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ea00ae3-4a64-4435-be9b-6d9aec346440-host\") pod \"node-ca-5lt7k\" (UID: \"5ea00ae3-4a64-4435-be9b-6d9aec346440\") " pod="openshift-image-registry/node-ca-5lt7k" Apr 
22 17:33:36.157348 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157123 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ct2d\" (UniqueName: \"kubernetes.io/projected/5ea00ae3-4a64-4435-be9b-6d9aec346440-kube-api-access-9ct2d\") pod \"node-ca-5lt7k\" (UID: \"5ea00ae3-4a64-4435-be9b-6d9aec346440\") " pod="openshift-image-registry/node-ca-5lt7k" Apr 22 17:33:36.157348 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157148 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-system-cni-dir\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.157348 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157172 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-host-var-lib-cni-multus\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.157348 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-host-var-lib-kubelet\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.157348 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157227 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2fec8229-262a-438e-a71a-26d8ef9fda02-ovn-node-metrics-cert\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.157348 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157251 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-kubernetes\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.157348 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157288 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-sysctl-conf\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.158240 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157318 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/000311a6-600b-4136-89c9-336cdc563106-os-release\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.158240 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157345 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-etc-kubernetes\") pod \"multus-snt7h\" 
(UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.158240 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157370 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-run-systemd\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.158240 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157396 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.158240 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157443 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6xj8\" (UniqueName: \"kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8\") pod \"network-check-target-mh726\" (UID: \"7c3eadbd-c6af-4686-bed4-c3a47b257864\") " pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:33:36.158240 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157470 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5ea00ae3-4a64-4435-be9b-6d9aec346440-serviceca\") pod \"node-ca-5lt7k\" (UID: \"5ea00ae3-4a64-4435-be9b-6d9aec346440\") " pod="openshift-image-registry/node-ca-5lt7k" Apr 22 17:33:36.158240 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157500 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-host-run-multus-certs\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.158240 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157529 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a585179c-cf1a-4d62-9c9c-9f26a01e3c39-iptables-alerter-script\") pod \"iptables-alerter-rhnfm\" (UID: \"a585179c-cf1a-4d62-9c9c-9f26a01e3c39\") " pod="openshift-network-operator/iptables-alerter-rhnfm" Apr 22 17:33:36.158240 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157556 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-sys\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.158240 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157604 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpslh\" (UniqueName: \"kubernetes.io/projected/000311a6-600b-4136-89c9-336cdc563106-kube-api-access-qpslh\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.158240 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157632 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-var-lib-openvswitch\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.158240 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157659 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-node-log\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.158240 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157686 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nw78\" (UniqueName: \"kubernetes.io/projected/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-kube-api-access-9nw78\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.158240 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157725 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs\") pod \"network-metrics-daemon-s8svp\" (UID: \"9feb1c60-1e90-405e-9beb-753e0747aed0\") " pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:36.158240 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157750 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/000311a6-600b-4136-89c9-336cdc563106-cni-binary-copy\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.158240 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157779 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/000311a6-600b-4136-89c9-336cdc563106-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.158962 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157819 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-kubelet\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.158962 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157846 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-run-ovn\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.158962 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157877 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-sysconfig\") pod \"tuned-rf2jd\" (UID: 
\"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.158962 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157909 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-host-var-lib-cni-bin\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.158962 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157937 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-systemd-units\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.158962 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157964 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-sysctl-d\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.158962 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.157989 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-systemd\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.158962 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158019 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-run-ovn-kubernetes\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.158962 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158046 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2fec8229-262a-438e-a71a-26d8ef9fda02-ovnkube-script-lib\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.158962 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158071 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-tuned\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.158962 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158098 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fea00d3c-4a77-47e4-84b9-89677ae7426c-cni-binary-copy\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.158962 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158126 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-slash\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.158962 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158154 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c9fd5d9-260c-45fa-9866-f2e61eedd051-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.158962 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158182 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3c9fd5d9-260c-45fa-9866-f2e61eedd051-etc-selinux\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.158962 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158214 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb44w\" (UniqueName: \"kubernetes.io/projected/3c9fd5d9-260c-45fa-9866-f2e61eedd051-kube-api-access-qb44w\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.158962 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158243 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-etc-openvswitch\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.158962 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158270 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a585179c-cf1a-4d62-9c9c-9f26a01e3c39-host-slash\") pod \"iptables-alerter-rhnfm\" (UID: \"a585179c-cf1a-4d62-9c9c-9f26a01e3c39\") " pod="openshift-network-operator/iptables-alerter-rhnfm" Apr 22 17:33:36.159742 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158306 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-425l4\" (UniqueName: \"kubernetes.io/projected/a585179c-cf1a-4d62-9c9c-9f26a01e3c39-kube-api-access-425l4\") pod \"iptables-alerter-rhnfm\" (UID: \"a585179c-cf1a-4d62-9c9c-9f26a01e3c39\") " pod="openshift-network-operator/iptables-alerter-rhnfm" Apr 22 17:33:36.159742 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158335 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-var-lib-kubelet\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.159742 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158363 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-tmp\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " 
pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.159742 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158390 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-run-openvswitch\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.159742 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158438 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4c25\" (UniqueName: \"kubernetes.io/projected/9feb1c60-1e90-405e-9beb-753e0747aed0-kube-api-access-g4c25\") pod \"network-metrics-daemon-s8svp\" (UID: \"9feb1c60-1e90-405e-9beb-753e0747aed0\") " pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:36.159742 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158468 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3c9fd5d9-260c-45fa-9866-f2e61eedd051-socket-dir\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.159742 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158512 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3c9fd5d9-260c-45fa-9866-f2e61eedd051-registration-dir\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.159742 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158517 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-etc-kubernetes\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.159742 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158548 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/000311a6-600b-4136-89c9-336cdc563106-system-cni-dir\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.159742 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158580 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/de2b18a3-8db8-472b-8406-2443f8f9c9b3-agent-certs\") pod \"konnectivity-agent-6756n\" (UID: \"de2b18a3-8db8-472b-8406-2443f8f9c9b3\") " pod="kube-system/konnectivity-agent-6756n" Apr 22 17:33:36.159742 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158583 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-host-run-k8s-cni-cncf-io\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.159742 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158609 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/de2b18a3-8db8-472b-8406-2443f8f9c9b3-konnectivity-ca\") pod \"konnectivity-agent-6756n\" (UID: \"de2b18a3-8db8-472b-8406-2443f8f9c9b3\") " pod="kube-system/konnectivity-agent-6756n" Apr 22 17:33:36.159742 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158659 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-host-run-netns\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.159742 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158671 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-run-systemd\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.159742 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158699 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d3515122-d7cf-41fe-855d-d19ccfe73070-hosts-file\") pod \"node-resolver-lxs9q\" (UID: \"d3515122-d7cf-41fe-855d-d19ccfe73070\") " pod="openshift-dns/node-resolver-lxs9q" Apr 22 17:33:36.159742 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158721 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.159742 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158794 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-host-run-multus-certs\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.162132 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.158853 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d3515122-d7cf-41fe-855d-d19ccfe73070-hosts-file\") pod \"node-resolver-lxs9q\" (UID: \"d3515122-d7cf-41fe-855d-d19ccfe73070\") " pod="openshift-dns/node-resolver-lxs9q" Apr 22 17:33:36.162132 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.159011 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-modprobe-d\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.162132 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.159099 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-cni-netd\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.162132 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.159123 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-run\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.162132 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.159176 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-host\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.162132 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.159213 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-lib-modules\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.162132 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.159234 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-multus-socket-dir-parent\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.162132 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.159279 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-hostroot\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.162132 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.159333 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-multus-cni-dir\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.162132 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.159342 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a585179c-cf1a-4d62-9c9c-9f26a01e3c39-iptables-alerter-script\") pod \"iptables-alerter-rhnfm\" (UID: \"a585179c-cf1a-4d62-9c9c-9f26a01e3c39\") " pod="openshift-network-operator/iptables-alerter-rhnfm" Apr 22 17:33:36.162132 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.159379 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-log-socket\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.162132 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.159437 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-cni-bin\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.162132 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.159469 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-sys\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.162132 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.159530 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-var-lib-openvswitch\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.162132 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.159548 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-system-cni-dir\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.162132 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.159574 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-node-log\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.162132 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.159592 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-host-var-lib-cni-multus\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.162132 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.159631 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-host-var-lib-kubelet\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.162988 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.159791 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fea00d3c-4a77-47e4-84b9-89677ae7426c-multus-daemon-config\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.162988 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.160162 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 17:33:36.162988 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.160300 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d3515122-d7cf-41fe-855d-d19ccfe73070-tmp-dir\") pod \"node-resolver-lxs9q\" (UID: \"d3515122-d7cf-41fe-855d-d19ccfe73070\") " pod="openshift-dns/node-resolver-lxs9q" Apr 22 17:33:36.162988 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.160309 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2fec8229-262a-438e-a71a-26d8ef9fda02-ovnkube-config\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.162988 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.160360 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-run-netns\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.162988 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.160478 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-kubelet\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.162988 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.160519 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-run-ovn\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.162988 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.160555 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-sysconfig\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.162988 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.160587 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-host-var-lib-cni-bin\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.162988 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.160620 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a585179c-cf1a-4d62-9c9c-9f26a01e3c39-host-slash\") pod \"iptables-alerter-rhnfm\" (UID: \"a585179c-cf1a-4d62-9c9c-9f26a01e3c39\") " pod="openshift-network-operator/iptables-alerter-rhnfm" Apr 22 17:33:36.162988 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.160654 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-systemd-units\") pod \"ovnkube-node-knlln\" (UID: 
\"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.162988 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.160690 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2fec8229-262a-438e-a71a-26d8ef9fda02-env-overrides\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.162988 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.160729 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-var-lib-kubelet\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.162988 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.160752 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-multus-conf-dir\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.162988 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.160775 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-sysctl-d\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.162988 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.160816 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-systemd\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.162988 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.160849 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-run-ovn-kubernetes\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.162988 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.161013 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fea00d3c-4a77-47e4-84b9-89677ae7426c-cnibin\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.163810 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.161056 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-host-slash\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.163810 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.161334 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-kubernetes\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " 
pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.163810 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.161352 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2fec8229-262a-438e-a71a-26d8ef9fda02-ovnkube-script-lib\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.163810 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.161400 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-etc-openvswitch\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.163810 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.161458 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-sysctl-conf\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.163810 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.161687 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2fec8229-262a-438e-a71a-26d8ef9fda02-run-openvswitch\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.163810 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.161984 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fea00d3c-4a77-47e4-84b9-89677ae7426c-cni-binary-copy\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.165019 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.164775 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2fec8229-262a-438e-a71a-26d8ef9fda02-ovn-node-metrics-cert\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.165756 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.165576 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-tmp\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.165756 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.165599 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-etc-tuned\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.168882 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.168311 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn4cw\" (UniqueName: \"kubernetes.io/projected/d3515122-d7cf-41fe-855d-d19ccfe73070-kube-api-access-bn4cw\") pod \"node-resolver-lxs9q\" (UID: \"d3515122-d7cf-41fe-855d-d19ccfe73070\") " 
pod="openshift-dns/node-resolver-lxs9q" Apr 22 17:33:36.171971 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.171936 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nw78\" (UniqueName: \"kubernetes.io/projected/dc7f3edc-3323-4fbb-8f3f-7862dfc56b51-kube-api-access-9nw78\") pod \"tuned-rf2jd\" (UID: \"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51\") " pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.171971 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.171941 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-425l4\" (UniqueName: \"kubernetes.io/projected/a585179c-cf1a-4d62-9c9c-9f26a01e3c39-kube-api-access-425l4\") pod \"iptables-alerter-rhnfm\" (UID: \"a585179c-cf1a-4d62-9c9c-9f26a01e3c39\") " pod="openshift-network-operator/iptables-alerter-rhnfm" Apr 22 17:33:36.173760 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.173693 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmk97\" (UniqueName: \"kubernetes.io/projected/fea00d3c-4a77-47e4-84b9-89677ae7426c-kube-api-access-zmk97\") pod \"multus-snt7h\" (UID: \"fea00d3c-4a77-47e4-84b9-89677ae7426c\") " pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.174570 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.174549 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-txkng\" (UniqueName: \"kubernetes.io/projected/2fec8229-262a-438e-a71a-26d8ef9fda02-kube-api-access-txkng\") pod \"ovnkube-node-knlln\" (UID: \"2fec8229-262a-438e-a71a-26d8ef9fda02\") " pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.259354 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259311 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3c9fd5d9-260c-45fa-9866-f2e61eedd051-registration-dir\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.259542 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259367 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/000311a6-600b-4136-89c9-336cdc563106-system-cni-dir\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.259542 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259401 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/de2b18a3-8db8-472b-8406-2443f8f9c9b3-agent-certs\") pod \"konnectivity-agent-6756n\" (UID: \"de2b18a3-8db8-472b-8406-2443f8f9c9b3\") " pod="kube-system/konnectivity-agent-6756n" Apr 22 17:33:36.259542 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259446 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/de2b18a3-8db8-472b-8406-2443f8f9c9b3-konnectivity-ca\") pod \"konnectivity-agent-6756n\" (UID: \"de2b18a3-8db8-472b-8406-2443f8f9c9b3\") " pod="kube-system/konnectivity-agent-6756n" Apr 22 17:33:36.259542 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259483 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/000311a6-600b-4136-89c9-336cdc563106-cnibin\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.259542 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259493 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3c9fd5d9-260c-45fa-9866-f2e61eedd051-registration-dir\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.259542 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259501 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/000311a6-600b-4136-89c9-336cdc563106-system-cni-dir\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.259846 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259509 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/000311a6-600b-4136-89c9-336cdc563106-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.259846 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259653 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/000311a6-600b-4136-89c9-336cdc563106-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.259846 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259666 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/000311a6-600b-4136-89c9-336cdc563106-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.259846 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259698 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/000311a6-600b-4136-89c9-336cdc563106-cnibin\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.259846 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259701 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3c9fd5d9-260c-45fa-9866-f2e61eedd051-sys-fs\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.259846 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259736 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3c9fd5d9-260c-45fa-9866-f2e61eedd051-device-dir\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.259846 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259762 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ea00ae3-4a64-4435-be9b-6d9aec346440-host\") pod \"node-ca-5lt7k\" (UID: \"5ea00ae3-4a64-4435-be9b-6d9aec346440\") " pod="openshift-image-registry/node-ca-5lt7k" Apr 22 17:33:36.259846 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259784 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ct2d\" (UniqueName: \"kubernetes.io/projected/5ea00ae3-4a64-4435-be9b-6d9aec346440-kube-api-access-9ct2d\") pod \"node-ca-5lt7k\" (UID: \"5ea00ae3-4a64-4435-be9b-6d9aec346440\") " pod="openshift-image-registry/node-ca-5lt7k" Apr 22 17:33:36.259846 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259814 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/000311a6-600b-4136-89c9-336cdc563106-os-release\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.259846 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259826 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3c9fd5d9-260c-45fa-9866-f2e61eedd051-device-dir\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.259846 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259841 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6xj8\" (UniqueName: \"kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8\") pod \"network-check-target-mh726\" (UID: \"7c3eadbd-c6af-4686-bed4-c3a47b257864\") " pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:33:36.260306 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259867 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5ea00ae3-4a64-4435-be9b-6d9aec346440-serviceca\") pod \"node-ca-5lt7k\" (UID: \"5ea00ae3-4a64-4435-be9b-6d9aec346440\") " pod="openshift-image-registry/node-ca-5lt7k" Apr 22 17:33:36.260306 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259896 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpslh\" (UniqueName: \"kubernetes.io/projected/000311a6-600b-4136-89c9-336cdc563106-kube-api-access-qpslh\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.260306 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259934 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs\") pod \"network-metrics-daemon-s8svp\" (UID: \"9feb1c60-1e90-405e-9beb-753e0747aed0\") " pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:36.260306 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259959 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/000311a6-600b-4136-89c9-336cdc563106-cni-binary-copy\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.260306 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.259984 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/000311a6-600b-4136-89c9-336cdc563106-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.260306 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.260041 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c9fd5d9-260c-45fa-9866-f2e61eedd051-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.260306 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.260065 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3c9fd5d9-260c-45fa-9866-f2e61eedd051-etc-selinux\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.260306 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.260067 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/de2b18a3-8db8-472b-8406-2443f8f9c9b3-konnectivity-ca\") pod \"konnectivity-agent-6756n\" (UID: \"de2b18a3-8db8-472b-8406-2443f8f9c9b3\") " pod="kube-system/konnectivity-agent-6756n" Apr 22 17:33:36.260306 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.260088 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qb44w\" (UniqueName: \"kubernetes.io/projected/3c9fd5d9-260c-45fa-9866-f2e61eedd051-kube-api-access-qb44w\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.260306 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.260118 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4c25\" (UniqueName: \"kubernetes.io/projected/9feb1c60-1e90-405e-9beb-753e0747aed0-kube-api-access-g4c25\") pod \"network-metrics-daemon-s8svp\" (UID: \"9feb1c60-1e90-405e-9beb-753e0747aed0\") " pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:36.260306 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.260137 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ea00ae3-4a64-4435-be9b-6d9aec346440-host\") pod \"node-ca-5lt7k\" (UID: \"5ea00ae3-4a64-4435-be9b-6d9aec346440\") " pod="openshift-image-registry/node-ca-5lt7k" Apr 22 17:33:36.260306 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.260141 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3c9fd5d9-260c-45fa-9866-f2e61eedd051-socket-dir\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.260306 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:36.260169 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:33:36.260306 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.260232 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c9fd5d9-260c-45fa-9866-f2e61eedd051-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.260306 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:36.260261 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs podName:9feb1c60-1e90-405e-9beb-753e0747aed0 nodeName:}" failed. No retries permitted until 2026-04-22 17:33:36.760229088 +0000 UTC m=+3.227490785 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs") pod "network-metrics-daemon-s8svp" (UID: "9feb1c60-1e90-405e-9beb-753e0747aed0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:33:36.260306 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.260260 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3c9fd5d9-260c-45fa-9866-f2e61eedd051-socket-dir\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.261174 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.260312 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3c9fd5d9-260c-45fa-9866-f2e61eedd051-etc-selinux\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.261174 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.260319 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/000311a6-600b-4136-89c9-336cdc563106-os-release\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.261174 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.260333 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/000311a6-600b-4136-89c9-336cdc563106-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.261174 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.260681 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/000311a6-600b-4136-89c9-336cdc563106-cni-binary-copy\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.261174 ip-10-0-131-22 
kubenswrapper[2572]: I0422 17:33:36.260755 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3c9fd5d9-260c-45fa-9866-f2e61eedd051-sys-fs\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.261174 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.260870 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5ea00ae3-4a64-4435-be9b-6d9aec346440-serviceca\") pod \"node-ca-5lt7k\" (UID: \"5ea00ae3-4a64-4435-be9b-6d9aec346440\") " pod="openshift-image-registry/node-ca-5lt7k" Apr 22 17:33:36.261174 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.261124 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/000311a6-600b-4136-89c9-336cdc563106-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.262600 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.262561 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/de2b18a3-8db8-472b-8406-2443f8f9c9b3-agent-certs\") pod \"konnectivity-agent-6756n\" (UID: \"de2b18a3-8db8-472b-8406-2443f8f9c9b3\") " pod="kube-system/konnectivity-agent-6756n" Apr 22 17:33:36.267357 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:36.267333 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:33:36.267488 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:36.267363 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:33:36.267488 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:36.267378 2572 projected.go:194] Error preparing data for projected volume kube-api-access-m6xj8 for pod openshift-network-diagnostics/network-check-target-mh726: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:33:36.267488 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:36.267466 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8 podName:7c3eadbd-c6af-4686-bed4-c3a47b257864 nodeName:}" failed. No retries permitted until 2026-04-22 17:33:36.767447791 +0000 UTC m=+3.234709486 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-m6xj8" (UniqueName: "kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8") pod "network-check-target-mh726" (UID: "7c3eadbd-c6af-4686-bed4-c3a47b257864") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:33:36.270054 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.269961 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpslh\" (UniqueName: \"kubernetes.io/projected/000311a6-600b-4136-89c9-336cdc563106-kube-api-access-qpslh\") pod \"multus-additional-cni-plugins-rdl42\" (UID: \"000311a6-600b-4136-89c9-336cdc563106\") " pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.270165 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.270136 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4c25\" (UniqueName: \"kubernetes.io/projected/9feb1c60-1e90-405e-9beb-753e0747aed0-kube-api-access-g4c25\") pod \"network-metrics-daemon-s8svp\" (UID: \"9feb1c60-1e90-405e-9beb-753e0747aed0\") " pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:36.271379 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.271338 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ct2d\" (UniqueName: \"kubernetes.io/projected/5ea00ae3-4a64-4435-be9b-6d9aec346440-kube-api-access-9ct2d\") pod \"node-ca-5lt7k\" (UID: \"5ea00ae3-4a64-4435-be9b-6d9aec346440\") " pod="openshift-image-registry/node-ca-5lt7k" Apr 22 17:33:36.271499 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.271376 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb44w\" (UniqueName: \"kubernetes.io/projected/3c9fd5d9-260c-45fa-9866-f2e61eedd051-kube-api-access-qb44w\") pod \"aws-ebs-csi-driver-node-8l8gc\" (UID: \"3c9fd5d9-260c-45fa-9866-f2e61eedd051\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.357671 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.357576 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:33:36.368638 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.368601 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rhnfm" Apr 22 17:33:36.379196 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.379164 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" Apr 22 17:33:36.387925 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.387892 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lxs9q" Apr 22 17:33:36.394653 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.394624 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-snt7h" Apr 22 17:33:36.402312 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.402281 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6756n" Apr 22 17:33:36.409906 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.409880 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" Apr 22 17:33:36.418605 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.418570 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5lt7k" Apr 22 17:33:36.425255 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.425232 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rdl42" Apr 22 17:33:36.459495 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.459462 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:33:36.763412 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.763330 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs\") pod \"network-metrics-daemon-s8svp\" (UID: \"9feb1c60-1e90-405e-9beb-753e0747aed0\") " pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:36.763619 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:36.763489 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:33:36.763619 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:36.763563 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs podName:9feb1c60-1e90-405e-9beb-753e0747aed0 nodeName:}" failed. No retries permitted until 2026-04-22 17:33:37.76354198 +0000 UTC m=+4.230803662 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs") pod "network-metrics-daemon-s8svp" (UID: "9feb1c60-1e90-405e-9beb-753e0747aed0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:33:36.864137 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:36.864095 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6xj8\" (UniqueName: \"kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8\") pod \"network-check-target-mh726\" (UID: \"7c3eadbd-c6af-4686-bed4-c3a47b257864\") " pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:33:36.864314 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:36.864226 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:33:36.864314 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:36.864252 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:33:36.864314 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:36.864263 2572 projected.go:194] Error preparing data for projected volume kube-api-access-m6xj8 for pod openshift-network-diagnostics/network-check-target-mh726: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:33:36.864314 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:36.864314 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8 podName:7c3eadbd-c6af-4686-bed4-c3a47b257864 nodeName:}" failed. No retries permitted until 2026-04-22 17:33:37.864300561 +0000 UTC m=+4.331562245 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-m6xj8" (UniqueName: "kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8") pod "network-check-target-mh726" (UID: "7c3eadbd-c6af-4686-bed4-c3a47b257864") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:33:36.943697 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:36.943661 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3515122_d7cf_41fe_855d_d19ccfe73070.slice/crio-8f61181390e3b82b6bc684c9a3991c7557657392aeffe1dcbdbb1d7f0e591733 WatchSource:0}: Error finding container 8f61181390e3b82b6bc684c9a3991c7557657392aeffe1dcbdbb1d7f0e591733: Status 404 returned error can't find the container with id 8f61181390e3b82b6bc684c9a3991c7557657392aeffe1dcbdbb1d7f0e591733 Apr 22 17:33:36.945191 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:36.945164 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfea00d3c_4a77_47e4_84b9_89677ae7426c.slice/crio-cc716445819af9be708c812bdd6e270448a1c6b046d847bafbb2c6ff31957316 WatchSource:0}: Error finding container cc716445819af9be708c812bdd6e270448a1c6b046d847bafbb2c6ff31957316: Status 404 returned error can't find the container with id cc716445819af9be708c812bdd6e270448a1c6b046d847bafbb2c6ff31957316 Apr 22 17:33:36.946036 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:36.946008 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde2b18a3_8db8_472b_8406_2443f8f9c9b3.slice/crio-5186fc13dcd6c91926f2ab91b6b93c15740c3b9ad6932429249c0567e2de9256 WatchSource:0}: Error finding container 5186fc13dcd6c91926f2ab91b6b93c15740c3b9ad6932429249c0567e2de9256: Status 404 returned error can't find the container with id 5186fc13dcd6c91926f2ab91b6b93c15740c3b9ad6932429249c0567e2de9256 Apr 22 17:33:36.946856 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:36.946835 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fec8229_262a_438e_a71a_26d8ef9fda02.slice/crio-b1ae405f5c2aeee658be8412d0a6fa9a152361484d71c7c95f3960e472463a9d WatchSource:0}: Error finding container b1ae405f5c2aeee658be8412d0a6fa9a152361484d71c7c95f3960e472463a9d: Status 404 returned error can't find the container with id b1ae405f5c2aeee658be8412d0a6fa9a152361484d71c7c95f3960e472463a9d Apr 22 17:33:36.947872 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:36.947847 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda585179c_cf1a_4d62_9c9c_9f26a01e3c39.slice/crio-87c7c43c4a103f5aa81c4f6a8f5cd341541c4d75b81cffdeddbd4d4dc3ff2bed WatchSource:0}: Error finding container 87c7c43c4a103f5aa81c4f6a8f5cd341541c4d75b81cffdeddbd4d4dc3ff2bed: Status 404 returned error can't find the container with id 87c7c43c4a103f5aa81c4f6a8f5cd341541c4d75b81cffdeddbd4d4dc3ff2bed Apr 22 17:33:36.949841 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:36.949816 2572 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc7f3edc_3323_4fbb_8f3f_7862dfc56b51.slice/crio-7d985cc88c8e873ffa7ef55547696e9f67e7fcd17935b9baff59782366158734 WatchSource:0}: Error finding container 7d985cc88c8e873ffa7ef55547696e9f67e7fcd17935b9baff59782366158734: Status 404 returned error can't find the container with id 7d985cc88c8e873ffa7ef55547696e9f67e7fcd17935b9baff59782366158734 Apr 22 17:33:36.950739 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:36.950717 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c9fd5d9_260c_45fa_9866_f2e61eedd051.slice/crio-aa149d7660b00aa152cc90faf42887092d93ca8528a7029ca919d39fb87489ac WatchSource:0}: Error finding container aa149d7660b00aa152cc90faf42887092d93ca8528a7029ca919d39fb87489ac: Status 404 returned error can't find the container with id aa149d7660b00aa152cc90faf42887092d93ca8528a7029ca919d39fb87489ac Apr 22 17:33:36.972307 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:36.972282 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod000311a6_600b_4136_89c9_336cdc563106.slice/crio-3ae5d022a1566b1881b12a933f6366eba5d6922e0393c237c082390bda47e48e WatchSource:0}: Error finding container 3ae5d022a1566b1881b12a933f6366eba5d6922e0393c237c082390bda47e48e: Status 404 returned error can't find the container with id 3ae5d022a1566b1881b12a933f6366eba5d6922e0393c237c082390bda47e48e Apr 22 17:33:36.972729 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:33:36.972707 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ea00ae3_4a64_4435_be9b_6d9aec346440.slice/crio-520ab0248b7d5d42bd5b305dd055d6506336b2c991892bf9a1f9b35f44defcd7 WatchSource:0}: Error finding container 520ab0248b7d5d42bd5b305dd055d6506336b2c991892bf9a1f9b35f44defcd7: Status 404 returned error can't find the container with id 520ab0248b7d5d42bd5b305dd055d6506336b2c991892bf9a1f9b35f44defcd7 Apr 22 17:33:37.089313 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:37.089116 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:28:35 +0000 UTC" deadline="2027-09-23 10:35:40.040215041 +0000 UTC" Apr 22 17:33:37.089313 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:37.089292 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12449h2m2.950927017s" Apr 22 17:33:37.124454 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:37.124410 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:37.124604 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:37.124563 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s8svp" podUID="9feb1c60-1e90-405e-9beb-753e0747aed0" Apr 22 17:33:37.132506 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:37.132476 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rdl42" event={"ID":"000311a6-600b-4136-89c9-336cdc563106","Type":"ContainerStarted","Data":"3ae5d022a1566b1881b12a933f6366eba5d6922e0393c237c082390bda47e48e"} Apr 22 17:33:37.133471 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:37.133446 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" event={"ID":"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51","Type":"ContainerStarted","Data":"7d985cc88c8e873ffa7ef55547696e9f67e7fcd17935b9baff59782366158734"} Apr 22 17:33:37.134346 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:37.134313 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rhnfm" event={"ID":"a585179c-cf1a-4d62-9c9c-9f26a01e3c39","Type":"ContainerStarted","Data":"87c7c43c4a103f5aa81c4f6a8f5cd341541c4d75b81cffdeddbd4d4dc3ff2bed"} Apr 22 17:33:37.135825 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:37.135800 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knlln" event={"ID":"2fec8229-262a-438e-a71a-26d8ef9fda02","Type":"ContainerStarted","Data":"b1ae405f5c2aeee658be8412d0a6fa9a152361484d71c7c95f3960e472463a9d"} Apr 22 17:33:37.137379 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:37.137353 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-snt7h" event={"ID":"fea00d3c-4a77-47e4-84b9-89677ae7426c","Type":"ContainerStarted","Data":"cc716445819af9be708c812bdd6e270448a1c6b046d847bafbb2c6ff31957316"} Apr 22 17:33:37.139095 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:37.139075 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-22.ec2.internal" event={"ID":"43fb11857c40bae062d44a60922715d6","Type":"ContainerStarted","Data":"966f31ad42b49c76ebef66ddb46c9513bcbd04974ece84179cabe99d2b6c037e"} Apr 22 17:33:37.140018 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:37.139997 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5lt7k" event={"ID":"5ea00ae3-4a64-4435-be9b-6d9aec346440","Type":"ContainerStarted","Data":"520ab0248b7d5d42bd5b305dd055d6506336b2c991892bf9a1f9b35f44defcd7"} Apr 22 17:33:37.141004 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:37.140978 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" event={"ID":"3c9fd5d9-260c-45fa-9866-f2e61eedd051","Type":"ContainerStarted","Data":"aa149d7660b00aa152cc90faf42887092d93ca8528a7029ca919d39fb87489ac"} Apr 22 17:33:37.141936 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:37.141915 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6756n" event={"ID":"de2b18a3-8db8-472b-8406-2443f8f9c9b3","Type":"ContainerStarted","Data":"5186fc13dcd6c91926f2ab91b6b93c15740c3b9ad6932429249c0567e2de9256"} Apr 22 17:33:37.142808 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:37.142791 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lxs9q" event={"ID":"d3515122-d7cf-41fe-855d-d19ccfe73070","Type":"ContainerStarted","Data":"8f61181390e3b82b6bc684c9a3991c7557657392aeffe1dcbdbb1d7f0e591733"} Apr 22 17:33:37.151174 ip-10-0-131-22 kubenswrapper[2572]: 
I0422 17:33:37.151137 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-22.ec2.internal" podStartSLOduration=2.151125935 podStartE2EDuration="2.151125935s" podCreationTimestamp="2026-04-22 17:33:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:33:37.151120807 +0000 UTC m=+3.618382508" watchObservedRunningTime="2026-04-22 17:33:37.151125935 +0000 UTC m=+3.618387637" Apr 22 17:33:37.773550 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:37.773509 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs\") pod \"network-metrics-daemon-s8svp\" (UID: \"9feb1c60-1e90-405e-9beb-753e0747aed0\") " pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:37.773732 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:37.773664 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:33:37.773732 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:37.773728 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs podName:9feb1c60-1e90-405e-9beb-753e0747aed0 nodeName:}" failed. No retries permitted until 2026-04-22 17:33:39.77370932 +0000 UTC m=+6.240971020 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs") pod "network-metrics-daemon-s8svp" (UID: "9feb1c60-1e90-405e-9beb-753e0747aed0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:33:37.873908 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:37.873869 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6xj8\" (UniqueName: \"kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8\") pod \"network-check-target-mh726\" (UID: \"7c3eadbd-c6af-4686-bed4-c3a47b257864\") " pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:33:37.874094 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:37.874060 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:33:37.874094 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:37.874080 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:33:37.874194 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:37.874096 2572 projected.go:194] Error preparing data for projected volume kube-api-access-m6xj8 for pod openshift-network-diagnostics/network-check-target-mh726: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:33:37.874194 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:37.874153 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8 podName:7c3eadbd-c6af-4686-bed4-c3a47b257864 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:33:39.874135517 +0000 UTC m=+6.341397204 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-m6xj8" (UniqueName: "kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8") pod "network-check-target-mh726" (UID: "7c3eadbd-c6af-4686-bed4-c3a47b257864") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:33:38.124665 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:38.124629 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:33:38.125295 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:38.124754 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mh726" podUID="7c3eadbd-c6af-4686-bed4-c3a47b257864" Apr 22 17:33:38.165746 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:38.165707 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal" event={"ID":"bd1d71453b132f13ea17ce4ef551d94c","Type":"ContainerStarted","Data":"6ca7c4cd2829ce1cadc9ff93cb1599448621c9d71f21f1e282c13aae36a4ab4e"} Apr 22 17:33:39.124342 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:39.123693 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:39.124342 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:39.123837 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s8svp" podUID="9feb1c60-1e90-405e-9beb-753e0747aed0" Apr 22 17:33:39.178450 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:39.177905 2572 generic.go:358] "Generic (PLEG): container finished" podID="bd1d71453b132f13ea17ce4ef551d94c" containerID="6ca7c4cd2829ce1cadc9ff93cb1599448621c9d71f21f1e282c13aae36a4ab4e" exitCode=0 Apr 22 17:33:39.178450 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:39.177965 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal" event={"ID":"bd1d71453b132f13ea17ce4ef551d94c","Type":"ContainerDied","Data":"6ca7c4cd2829ce1cadc9ff93cb1599448621c9d71f21f1e282c13aae36a4ab4e"} Apr 22 17:33:39.794452 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:39.794398 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs\") pod \"network-metrics-daemon-s8svp\" (UID: \"9feb1c60-1e90-405e-9beb-753e0747aed0\") " pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:39.794660 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:39.794543 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:33:39.794660 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:39.794621 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs podName:9feb1c60-1e90-405e-9beb-753e0747aed0 nodeName:}" failed. No retries permitted until 2026-04-22 17:33:43.794600752 +0000 UTC m=+10.261862450 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs") pod "network-metrics-daemon-s8svp" (UID: "9feb1c60-1e90-405e-9beb-753e0747aed0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:33:39.895333 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:39.895294 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6xj8\" (UniqueName: \"kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8\") pod \"network-check-target-mh726\" (UID: \"7c3eadbd-c6af-4686-bed4-c3a47b257864\") " pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:33:39.895569 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:39.895527 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:33:39.895569 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:39.895547 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:33:39.895569 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:39.895559 2572 projected.go:194] Error preparing data for projected volume kube-api-access-m6xj8 for pod openshift-network-diagnostics/network-check-target-mh726: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:33:39.895724 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:39.895622 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8 podName:7c3eadbd-c6af-4686-bed4-c3a47b257864 nodeName:}" failed. No retries permitted until 2026-04-22 17:33:43.89560205 +0000 UTC m=+10.362863750 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-m6xj8" (UniqueName: "kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8") pod "network-check-target-mh726" (UID: "7c3eadbd-c6af-4686-bed4-c3a47b257864") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:33:40.124287 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:40.124182 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:33:40.124531 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:40.124297 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mh726" podUID="7c3eadbd-c6af-4686-bed4-c3a47b257864" Apr 22 17:33:41.124020 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:41.123983 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:41.124496 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:41.124133 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s8svp" podUID="9feb1c60-1e90-405e-9beb-753e0747aed0" Apr 22 17:33:42.124520 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:42.124442 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:33:42.124963 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:42.124573 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mh726" podUID="7c3eadbd-c6af-4686-bed4-c3a47b257864" Apr 22 17:33:43.123969 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:43.123911 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:43.124183 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:43.124074 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s8svp" podUID="9feb1c60-1e90-405e-9beb-753e0747aed0" Apr 22 17:33:43.829158 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:43.829074 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs\") pod \"network-metrics-daemon-s8svp\" (UID: \"9feb1c60-1e90-405e-9beb-753e0747aed0\") " pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:43.829756 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:43.829192 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:33:43.829756 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:43.829257 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs podName:9feb1c60-1e90-405e-9beb-753e0747aed0 nodeName:}" failed. No retries permitted until 2026-04-22 17:33:51.829237689 +0000 UTC m=+18.296499376 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs") pod "network-metrics-daemon-s8svp" (UID: "9feb1c60-1e90-405e-9beb-753e0747aed0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:33:43.929941 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:43.929890 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6xj8\" (UniqueName: \"kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8\") pod \"network-check-target-mh726\" (UID: \"7c3eadbd-c6af-4686-bed4-c3a47b257864\") " pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:33:43.930116 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:43.930071 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:33:43.930116 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:43.930091 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:33:43.930116 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:43.930103 2572 projected.go:194] Error preparing data for projected volume kube-api-access-m6xj8 for pod openshift-network-diagnostics/network-check-target-mh726: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:33:43.930279 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:43.930159 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8 podName:7c3eadbd-c6af-4686-bed4-c3a47b257864 nodeName:}" failed. No retries permitted until 2026-04-22 17:33:51.930140955 +0000 UTC m=+18.397402656 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-m6xj8" (UniqueName: "kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8") pod "network-check-target-mh726" (UID: "7c3eadbd-c6af-4686-bed4-c3a47b257864") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:33:44.124599 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:44.124496 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:33:44.124760 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:44.124651 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mh726" podUID="7c3eadbd-c6af-4686-bed4-c3a47b257864" Apr 22 17:33:45.124596 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:45.124560 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:45.125263 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:45.124713 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s8svp" podUID="9feb1c60-1e90-405e-9beb-753e0747aed0" Apr 22 17:33:46.124616 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:46.124582 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:33:46.125078 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:46.124725 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mh726" podUID="7c3eadbd-c6af-4686-bed4-c3a47b257864" Apr 22 17:33:47.123536 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:47.123492 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:47.123701 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:47.123646 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s8svp" podUID="9feb1c60-1e90-405e-9beb-753e0747aed0" Apr 22 17:33:48.124227 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:48.124186 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:33:48.124712 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:48.124302 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mh726" podUID="7c3eadbd-c6af-4686-bed4-c3a47b257864" Apr 22 17:33:49.124142 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:49.124109 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:49.124314 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:49.124235 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s8svp" podUID="9feb1c60-1e90-405e-9beb-753e0747aed0" Apr 22 17:33:50.124380 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:50.124287 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:33:50.124961 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:50.124443 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mh726" podUID="7c3eadbd-c6af-4686-bed4-c3a47b257864" Apr 22 17:33:51.123413 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:51.123365 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:51.123654 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:51.123522 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s8svp" podUID="9feb1c60-1e90-405e-9beb-753e0747aed0" Apr 22 17:33:51.887862 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:51.887803 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs\") pod \"network-metrics-daemon-s8svp\" (UID: \"9feb1c60-1e90-405e-9beb-753e0747aed0\") " pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:51.888310 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:51.887941 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:33:51.888310 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:51.888028 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs podName:9feb1c60-1e90-405e-9beb-753e0747aed0 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:07.888007221 +0000 UTC m=+34.355268900 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs") pod "network-metrics-daemon-s8svp" (UID: "9feb1c60-1e90-405e-9beb-753e0747aed0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:33:51.988589 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:51.988551 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6xj8\" (UniqueName: \"kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8\") pod \"network-check-target-mh726\" (UID: \"7c3eadbd-c6af-4686-bed4-c3a47b257864\") " pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:33:51.988774 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:51.988695 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:33:51.988774 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:51.988709 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:33:51.988774 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:51.988718 2572 projected.go:194] Error preparing data for projected volume kube-api-access-m6xj8 for pod openshift-network-diagnostics/network-check-target-mh726: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:33:51.988774 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:51.988772 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8 podName:7c3eadbd-c6af-4686-bed4-c3a47b257864 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:07.988754925 +0000 UTC m=+34.456016624 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-m6xj8" (UniqueName: "kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8") pod "network-check-target-mh726" (UID: "7c3eadbd-c6af-4686-bed4-c3a47b257864") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:33:52.123878 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:52.123832 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:33:52.124056 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:52.123968 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mh726" podUID="7c3eadbd-c6af-4686-bed4-c3a47b257864" Apr 22 17:33:53.124092 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:53.124057 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:53.124537 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:53.124179 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s8svp" podUID="9feb1c60-1e90-405e-9beb-753e0747aed0" Apr 22 17:33:54.123966 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:54.123937 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:33:54.124130 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:54.124020 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mh726" podUID="7c3eadbd-c6af-4686-bed4-c3a47b257864" Apr 22 17:33:55.124405 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.124094 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:55.124910 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:55.124524 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s8svp" podUID="9feb1c60-1e90-405e-9beb-753e0747aed0" Apr 22 17:33:55.221521 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.221088 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal" event={"ID":"bd1d71453b132f13ea17ce4ef551d94c","Type":"ContainerStarted","Data":"3daccbf8022b0e4693b54a2f7de0845e589be27465e5dc88b9a82c0c32a7dde8"} Apr 22 17:33:55.223399 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.223363 2572 generic.go:358] "Generic (PLEG): container finished" podID="000311a6-600b-4136-89c9-336cdc563106" containerID="f656eb759e01ac640ea9140b9786f92906685be3c4292ad689955073b1ec9da9" exitCode=0 Apr 22 17:33:55.223552 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.223451 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rdl42" event={"ID":"000311a6-600b-4136-89c9-336cdc563106","Type":"ContainerDied","Data":"f656eb759e01ac640ea9140b9786f92906685be3c4292ad689955073b1ec9da9"} Apr 22 17:33:55.225324 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.225131 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" event={"ID":"dc7f3edc-3323-4fbb-8f3f-7862dfc56b51","Type":"ContainerStarted","Data":"3d0434c0fd92afa946fc614f37aff6653e3cfa5e87886db7ea2e2b99b653f08f"} Apr 22 17:33:55.228035 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.228008 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knlln" event={"ID":"2fec8229-262a-438e-a71a-26d8ef9fda02","Type":"ContainerStarted","Data":"cbc96ea1258d2f3be2e0b20c10a5cabd92400fd96a6cc842b9763964421fcdfd"} Apr 22 17:33:55.228143 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.228059 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knlln" event={"ID":"2fec8229-262a-438e-a71a-26d8ef9fda02","Type":"ContainerStarted","Data":"9a26615afb19b7bf5b1a4282ca633890979c880dea96da1914b66aa21b5103d4"} Apr 22 17:33:55.228143 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.228077 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knlln" event={"ID":"2fec8229-262a-438e-a71a-26d8ef9fda02","Type":"ContainerStarted","Data":"9dde7eee618dc90e49772f3170aa6cd28f5ebe584fe1e99113db28dbf95e1524"} Apr 22 17:33:55.228143 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.228086 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knlln" event={"ID":"2fec8229-262a-438e-a71a-26d8ef9fda02","Type":"ContainerStarted","Data":"5d30fbe1cd8b3bb22d34e1ca9eb08073b3d96113ee648facb7dcd3460eef11e3"} Apr 22 17:33:55.228143 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.228097 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knlln" event={"ID":"2fec8229-262a-438e-a71a-26d8ef9fda02","Type":"ContainerStarted","Data":"47f9bb1e4d230be7ce43880aa7e8dddb1afeca6205a72ec82dc3680ce3f4b358"} Apr 22 17:33:55.228143 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.228109 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knlln" event={"ID":"2fec8229-262a-438e-a71a-26d8ef9fda02","Type":"ContainerStarted","Data":"db53041b649a93f8fc776353fc313ec28f3f82ac25f5de4aae1a67bad6d243e4"} Apr 22 17:33:55.229632 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.229608 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-snt7h" event={"ID":"fea00d3c-4a77-47e4-84b9-89677ae7426c","Type":"ContainerStarted","Data":"b5716052f7289706b13aaa6bac360ff5098290b99dc6e032fac97ef40a65f913"} Apr 22 17:33:55.231098 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.231073 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5lt7k" event={"ID":"5ea00ae3-4a64-4435-be9b-6d9aec346440","Type":"ContainerStarted","Data":"fc0a7d45325f692f7e1234e40aa61b2b0ef16c883e5ca460686b94c55fe8abab"} Apr 22 17:33:55.232436 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.232386 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" event={"ID":"3c9fd5d9-260c-45fa-9866-f2e61eedd051","Type":"ContainerStarted","Data":"f97bb591b537274a38c47abbae3590c08f8c87dfa97ed79f922f133d7dba93ab"} Apr 22 17:33:55.233908 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.233882 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6756n" event={"ID":"de2b18a3-8db8-472b-8406-2443f8f9c9b3","Type":"ContainerStarted","Data":"a3f5cd5716011f77edc051dc3f858ba2d050c9e83babb60bacde93f71172f51e"} Apr 22 17:33:55.234366 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.234313 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-22.ec2.internal" podStartSLOduration=20.234296681 podStartE2EDuration="20.234296681s" podCreationTimestamp="2026-04-22 17:33:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:33:55.233807707 +0000 UTC m=+21.701069410" watchObservedRunningTime="2026-04-22 17:33:55.234296681 +0000 UTC m=+21.701558384" Apr 22 17:33:55.235312 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.235289 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lxs9q" event={"ID":"d3515122-d7cf-41fe-855d-d19ccfe73070","Type":"ContainerStarted","Data":"7fc7f119b11463ae4a0f47a126e4df854f657aa2415d33087ef9714ee80b05f1"} Apr 22 17:33:55.246487 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.246402 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-rf2jd" podStartSLOduration=3.9603719870000003 podStartE2EDuration="21.246385604s" podCreationTimestamp="2026-04-22 17:33:34 +0000 UTC" firstStartedPulling="2026-04-22 17:33:36.970791598 +0000 UTC m=+3.438053279" lastFinishedPulling="2026-04-22 17:33:54.256805202 +0000 UTC m=+20.724066896" observedRunningTime="2026-04-22 17:33:55.246326883 +0000 UTC m=+21.713588620" watchObservedRunningTime="2026-04-22 17:33:55.246385604 +0000 UTC m=+21.713647308" Apr 22 17:33:55.257702 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.257656 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5lt7k" podStartSLOduration=8.765954654 podStartE2EDuration="21.257642844s" podCreationTimestamp="2026-04-22 17:33:34 +0000 UTC" firstStartedPulling="2026-04-22 17:33:36.974881923 +0000 UTC m=+3.442143603" lastFinishedPulling="2026-04-22 17:33:49.46657011 +0000 UTC m=+15.933831793" observedRunningTime="2026-04-22 17:33:55.257511944 +0000 UTC m=+21.724773669" watchObservedRunningTime="2026-04-22 17:33:55.257642844 +0000 UTC m=+21.724904545" Apr 22 17:33:55.297435 ip-10-0-131-22 
kubenswrapper[2572]: I0422 17:33:55.297359 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-snt7h" podStartSLOduration=3.949045917 podStartE2EDuration="21.297341318s" podCreationTimestamp="2026-04-22 17:33:34 +0000 UTC" firstStartedPulling="2026-04-22 17:33:36.947379805 +0000 UTC m=+3.414641499" lastFinishedPulling="2026-04-22 17:33:54.295675205 +0000 UTC m=+20.762936900" observedRunningTime="2026-04-22 17:33:55.29682934 +0000 UTC m=+21.764091033" watchObservedRunningTime="2026-04-22 17:33:55.297341318 +0000 UTC m=+21.764603022" Apr 22 17:33:55.311707 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.311657 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lxs9q" podStartSLOduration=4.002054374 podStartE2EDuration="21.311642152s" podCreationTimestamp="2026-04-22 17:33:34 +0000 UTC" firstStartedPulling="2026-04-22 17:33:36.945382987 +0000 UTC m=+3.412644668" lastFinishedPulling="2026-04-22 17:33:54.254970766 +0000 UTC m=+20.722232446" observedRunningTime="2026-04-22 17:33:55.310953591 +0000 UTC m=+21.778215305" watchObservedRunningTime="2026-04-22 17:33:55.311642152 +0000 UTC m=+21.778903854" Apr 22 17:33:55.325362 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.325312 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6756n" podStartSLOduration=4.038329688 podStartE2EDuration="21.325299913s" podCreationTimestamp="2026-04-22 17:33:34 +0000 UTC" firstStartedPulling="2026-04-22 17:33:36.947845914 +0000 UTC m=+3.415107595" lastFinishedPulling="2026-04-22 17:33:54.234816125 +0000 UTC m=+20.702077820" observedRunningTime="2026-04-22 17:33:55.325097883 +0000 UTC m=+21.792359589" watchObservedRunningTime="2026-04-22 17:33:55.325299913 +0000 UTC m=+21.792561615" Apr 22 17:33:55.627685 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:55.627648 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 17:33:56.123652 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:56.123585 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:33:56.123940 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:56.123717 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mh726" podUID="7c3eadbd-c6af-4686-bed4-c3a47b257864" Apr 22 17:33:56.132339 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:56.132250 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T17:33:55.627668286Z","UUID":"a058e7f0-75c9-4d60-be3f-fa23d159853e","Handler":null,"Name":"","Endpoint":""} Apr 22 17:33:56.134645 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:56.134618 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 17:33:56.134645 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:56.134652 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 17:33:56.239411 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:56.239301 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rhnfm" event={"ID":"a585179c-cf1a-4d62-9c9c-9f26a01e3c39","Type":"ContainerStarted","Data":"64fdbe7a48890d8fe5a5b22ae5eafcd0dd92d851548523fbbc9046b8b6d7b981"} Apr 22 17:33:56.241391 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:56.241324 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" event={"ID":"3c9fd5d9-260c-45fa-9866-f2e61eedd051","Type":"ContainerStarted","Data":"c9c8335dd126c9e9b25a558896f23f2122c83b123f226043d08b2200391de7c5"} Apr 22 17:33:57.123449 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:57.123403 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:57.123615 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:57.123557 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s8svp" podUID="9feb1c60-1e90-405e-9beb-753e0747aed0" Apr 22 17:33:57.248037 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:57.247925 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knlln" event={"ID":"2fec8229-262a-438e-a71a-26d8ef9fda02","Type":"ContainerStarted","Data":"93779c738cdc574e4ce29d0a4c98c776e1a7ddb7d3dc57e27db36e866a346b5f"} Apr 22 17:33:57.250245 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:57.250192 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" event={"ID":"3c9fd5d9-260c-45fa-9866-f2e61eedd051","Type":"ContainerStarted","Data":"0d33b6af5509e7790455295cd17fa8636487fd8cc8684dd7ca466cb13a829f00"} Apr 22 17:33:57.268410 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:57.268361 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8l8gc" podStartSLOduration=3.6659314739999997 podStartE2EDuration="23.268345392s" podCreationTimestamp="2026-04-22 17:33:34 +0000 UTC" firstStartedPulling="2026-04-22 17:33:36.970788066 +0000 UTC m=+3.438049749" lastFinishedPulling="2026-04-22 17:33:56.573201969 +0000 UTC m=+23.040463667" observedRunningTime="2026-04-22 17:33:57.268016886 +0000 UTC m=+23.735278588" watchObservedRunningTime="2026-04-22 17:33:57.268345392 +0000 UTC m=+23.735607093" Apr 22 17:33:57.268682 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:57.268652 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rhnfm" podStartSLOduration=5.9632797570000005 podStartE2EDuration="23.268645135s" podCreationTimestamp="2026-04-22 17:33:34 +0000 UTC" firstStartedPulling="2026-04-22 17:33:36.949680924 +0000 UTC m=+3.416942619" lastFinishedPulling="2026-04-22 17:33:54.255046307 +0000 UTC m=+20.722307997" observedRunningTime="2026-04-22 17:33:56.263630316 +0000 UTC m=+22.730892019" watchObservedRunningTime="2026-04-22 17:33:57.268645135 +0000 UTC m=+23.735906840" Apr 22 17:33:58.123545 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:58.123511 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:33:58.123722 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:58.123649 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mh726" podUID="7c3eadbd-c6af-4686-bed4-c3a47b257864" Apr 22 17:33:59.123968 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:59.123935 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:33:59.124465 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:33:59.124072 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s8svp" podUID="9feb1c60-1e90-405e-9beb-753e0747aed0" Apr 22 17:33:59.501913 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:59.501734 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6756n" Apr 22 17:33:59.502379 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:33:59.502359 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6756n" Apr 22 17:34:00.123907 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:00.123824 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:34:00.124042 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:00.123929 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mh726" podUID="7c3eadbd-c6af-4686-bed4-c3a47b257864" Apr 22 17:34:00.258051 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:00.258010 2572 generic.go:358] "Generic (PLEG): container finished" podID="000311a6-600b-4136-89c9-336cdc563106" containerID="8471fa6b71c7262f44f6d79f629e55b104ef9b459615f749aa676e05423b5a4f" exitCode=0 Apr 22 17:34:00.258206 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:00.258057 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rdl42" event={"ID":"000311a6-600b-4136-89c9-336cdc563106","Type":"ContainerDied","Data":"8471fa6b71c7262f44f6d79f629e55b104ef9b459615f749aa676e05423b5a4f"} Apr 22 17:34:00.261404 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:00.261377 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knlln" event={"ID":"2fec8229-262a-438e-a71a-26d8ef9fda02","Type":"ContainerStarted","Data":"b6a125a11393b45fe87534eb0ab34b1efd36a08457b3b87d1bb955d23ecb2750"} Apr 22 17:34:00.261625 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:00.261609 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6756n" Apr 22 17:34:00.261959 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:00.261913 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:34:00.262415 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:00.262402 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6756n" Apr 22 17:34:00.277347 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:00.277323 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:34:00.304606 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:00.304560 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-knlln" podStartSLOduration=8.53442916 podStartE2EDuration="26.30454562s" podCreationTimestamp="2026-04-22 17:33:34 +0000 UTC" firstStartedPulling="2026-04-22 17:33:36.949404183 +0000 UTC m=+3.416665876" lastFinishedPulling="2026-04-22 17:33:54.719520637 +0000 UTC m=+21.186782336" observedRunningTime="2026-04-22 17:34:00.303283419 +0000 UTC m=+26.770545120" 
watchObservedRunningTime="2026-04-22 17:34:00.30454562 +0000 UTC m=+26.771807336" Apr 22 17:34:01.123920 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:01.123740 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:34:01.124097 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:01.123999 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s8svp" podUID="9feb1c60-1e90-405e-9beb-753e0747aed0" Apr 22 17:34:01.230278 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:01.230191 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mh726"] Apr 22 17:34:01.230458 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:01.230289 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:34:01.230458 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:01.230376 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mh726" podUID="7c3eadbd-c6af-4686-bed4-c3a47b257864" Apr 22 17:34:01.233043 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:01.233019 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-s8svp"] Apr 22 17:34:01.265722 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:01.265693 2572 generic.go:358] "Generic (PLEG): container finished" podID="000311a6-600b-4136-89c9-336cdc563106" containerID="f4268eb13d84967d41742d544db444f4caa1a1c1b97d0650a24450d2ea933c6c" exitCode=0 Apr 22 17:34:01.265875 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:01.265801 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rdl42" event={"ID":"000311a6-600b-4136-89c9-336cdc563106","Type":"ContainerDied","Data":"f4268eb13d84967d41742d544db444f4caa1a1c1b97d0650a24450d2ea933c6c"} Apr 22 17:34:01.265981 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:01.265962 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:34:01.266589 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:01.266087 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s8svp" podUID="9feb1c60-1e90-405e-9beb-753e0747aed0" Apr 22 17:34:01.267138 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:01.266699 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:34:01.267138 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:01.266740 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:34:01.281941 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:01.281918 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:34:02.269570 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:02.269540 2572 generic.go:358] "Generic (PLEG): container finished" podID="000311a6-600b-4136-89c9-336cdc563106" containerID="1b5b1d499af3e64e9cc4df1cbd218f7c58eaedef00e58bc331ca6ad8bc9cf102" exitCode=0 Apr 22 17:34:02.269972 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:02.269627 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rdl42" event={"ID":"000311a6-600b-4136-89c9-336cdc563106","Type":"ContainerDied","Data":"1b5b1d499af3e64e9cc4df1cbd218f7c58eaedef00e58bc331ca6ad8bc9cf102"} Apr 22 17:34:03.123972 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:03.123935 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:34:03.124214 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:03.123935 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:34:03.124214 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:03.124053 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mh726" podUID="7c3eadbd-c6af-4686-bed4-c3a47b257864" Apr 22 17:34:03.124214 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:03.124168 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s8svp" podUID="9feb1c60-1e90-405e-9beb-753e0747aed0" Apr 22 17:34:05.123892 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:05.123855 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:34:05.124734 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:05.123868 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:34:05.124734 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:05.123983 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mh726" podUID="7c3eadbd-c6af-4686-bed4-c3a47b257864" Apr 22 17:34:05.124734 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:05.124093 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s8svp" podUID="9feb1c60-1e90-405e-9beb-753e0747aed0" Apr 22 17:34:07.123791 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.123573 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:34:07.124255 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.123648 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:34:07.124255 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.123874 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mh726" podUID="7c3eadbd-c6af-4686-bed4-c3a47b257864" Apr 22 17:34:07.124255 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.123961 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s8svp" podUID="9feb1c60-1e90-405e-9beb-753e0747aed0" Apr 22 17:34:07.373334 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.373212 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-22.ec2.internal" event="NodeReady" Apr 22 17:34:07.373511 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.373350 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 17:34:07.404091 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.404052 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v8lmw"] Apr 22 17:34:07.433172 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.433103 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5944f7956-vsk55"] Apr 22 17:34:07.433544 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.433347 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v8lmw" Apr 22 17:34:07.436720 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.436692 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:34:07.436935 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.436724 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 17:34:07.436935 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.436726 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-mfczx\"" Apr 22 17:34:07.448571 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.448541 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-69c75ddc67-c9dl5"] Apr 22 17:34:07.448879 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.448856 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.451341 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.451316 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-mmlbf\"" Apr 22 17:34:07.451686 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.451656 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 17:34:07.451871 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.451690 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 17:34:07.451971 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.451855 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 17:34:07.459374 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.459068 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 17:34:07.465630 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.465605 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-9d9fb4b58-dpchv"] Apr 22 17:34:07.465782 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.465761 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.484615 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.484579 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-vvmj2"] Apr 22 17:34:07.484763 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.484748 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:07.487350 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.487327 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 17:34:07.487513 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.487493 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 22 17:34:07.487592 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.487499 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 22 17:34:07.487592 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.487565 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 22 17:34:07.487693 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.487620 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-nn5dh\"" Apr 22 17:34:07.487922 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.487901 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 17:34:07.487922 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.487909 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 22 17:34:07.501286 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.501253 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stksb\" (UniqueName: \"kubernetes.io/projected/8b7e852e-32bb-4bbe-be45-374b4376ee6d-kube-api-access-stksb\") pod \"volume-data-source-validator-7c6cbb6c87-v8lmw\" (UID: \"8b7e852e-32bb-4bbe-be45-374b4376ee6d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v8lmw" Apr 22 17:34:07.503800 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.503777 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-7rk27"] Apr 22 17:34:07.503981 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.503962 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" Apr 22 17:34:07.506474 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.506449 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 17:34:07.506604 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.506497 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:34:07.506604 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.506538 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 17:34:07.506604 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.506567 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-pfk5z\"" Apr 22 17:34:07.506781 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.506649 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 17:34:07.510809 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.510655 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 17:34:07.524851 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.524827 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-2qtfd"] Apr 22 17:34:07.525000 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.524985 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-7rk27" Apr 22 17:34:07.527737 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.527709 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 17:34:07.527850 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.527756 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 17:34:07.527850 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.527823 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 17:34:07.528044 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.528025 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-gj6pw\"" Apr 22 17:34:07.528373 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.528353 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 17:34:07.534886 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.534867 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 17:34:07.537074 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.537054 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp6f7"] Apr 22 17:34:07.537225 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.537195 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2qtfd" Apr 22 17:34:07.539864 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.539842 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-ll24v\"" Apr 22 17:34:07.539983 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.539876 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 17:34:07.540117 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.540100 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 17:34:07.558121 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.558083 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw"] Apr 22 17:34:07.558290 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.558270 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp6f7" Apr 22 17:34:07.560518 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.560486 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 17:34:07.560821 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.560643 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:34:07.560821 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.560707 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 17:34:07.560821 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.560770 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-n9rkg\"" Apr 22 17:34:07.573035 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.573006 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-pcpz9"] Apr 22 17:34:07.573171 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.573155 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw" Apr 22 17:34:07.575573 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.575549 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 17:34:07.575744 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.575580 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 17:34:07.575856 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.575594 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 17:34:07.575856 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.575665 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-b6264\"" Apr 22 17:34:07.576676 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.576658 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 17:34:07.592491 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.592464 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6bzf9"] Apr 22 17:34:07.592671 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.592648 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-pcpz9" Apr 22 17:34:07.595127 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.595105 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 17:34:07.595251 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.595205 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 17:34:07.595317 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.595284 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-rxzrr\"" Apr 22 17:34:07.601941 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.601738 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stksb\" (UniqueName: \"kubernetes.io/projected/8b7e852e-32bb-4bbe-be45-374b4376ee6d-kube-api-access-stksb\") pod \"volume-data-source-validator-7c6cbb6c87-v8lmw\" (UID: \"8b7e852e-32bb-4bbe-be45-374b4376ee6d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v8lmw" Apr 22 17:34:07.602064 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.601990 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfe6b31-c032-43ba-be07-caa12af15041-serving-cert\") pod \"console-operator-9d4b6777b-vvmj2\" (UID: \"7bfe6b31-c032-43ba-be07-caa12af15041\") " pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" Apr 22 17:34:07.602064 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602028 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xshvq\" (UniqueName: 
\"kubernetes.io/projected/17af67ec-8577-45de-abbb-01a7199ee7cd-kube-api-access-xshvq\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:07.602179 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602093 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/84e06372-35da-4f2d-84a9-4f0537970fba-image-registry-private-configuration\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.602179 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602158 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m9r9\" (UniqueName: \"kubernetes.io/projected/2dd100f6-0060-426d-9cf7-a7f9fafa003a-kube-api-access-5m9r9\") pod \"insights-operator-585dfdc468-7rk27\" (UID: \"2dd100f6-0060-426d-9cf7-a7f9fafa003a\") " pod="openshift-insights/insights-operator-585dfdc468-7rk27" Apr 22 17:34:07.602281 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602207 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-default-certificate\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:07.602281 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602245 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84e06372-35da-4f2d-84a9-4f0537970fba-trusted-ca\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.602281 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602270 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/31e11e0d-c49c-4634-b336-26f608c0be83-trusted-ca\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.602449 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602298 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:07.602449 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602324 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/84e06372-35da-4f2d-84a9-4f0537970fba-ca-trust-extracted\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.602449 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602353 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bfe6b31-c032-43ba-be07-caa12af15041-trusted-ca\") pod \"console-operator-9d4b6777b-vvmj2\" (UID: \"7bfe6b31-c032-43ba-be07-caa12af15041\") " pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" Apr 22 17:34:07.602449 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602391 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-bound-sa-token\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.602650 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602454 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:07.602650 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602488 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-stats-auth\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:07.602650 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602515 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrzmm\" (UniqueName: \"kubernetes.io/projected/7bfe6b31-c032-43ba-be07-caa12af15041-kube-api-access-mrzmm\") pod \"console-operator-9d4b6777b-vvmj2\" (UID: \"7bfe6b31-c032-43ba-be07-caa12af15041\") " pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" Apr 22 17:34:07.602650 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602546 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/84e06372-35da-4f2d-84a9-4f0537970fba-registry-certificates\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.602650 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602571 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bfe6b31-c032-43ba-be07-caa12af15041-config\") pod \"console-operator-9d4b6777b-vvmj2\" (UID: \"7bfe6b31-c032-43ba-be07-caa12af15041\") " pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" Apr 22 17:34:07.602650 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602613 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.602919 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602657 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/31e11e0d-c49c-4634-b336-26f608c0be83-installation-pull-secrets\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.602919 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602710 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/84e06372-35da-4f2d-84a9-4f0537970fba-installation-pull-secrets\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.602919 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602737 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.602919 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602765 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtclj\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-kube-api-access-jtclj\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.602919 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602795 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr9rz\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-kube-api-access-xr9rz\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.602919 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602839 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/31e11e0d-c49c-4634-b336-26f608c0be83-image-registry-private-configuration\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.602919 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602873 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dd100f6-0060-426d-9cf7-a7f9fafa003a-service-ca-bundle\") pod \"insights-operator-585dfdc468-7rk27\" (UID: \"2dd100f6-0060-426d-9cf7-a7f9fafa003a\") " pod="openshift-insights/insights-operator-585dfdc468-7rk27" Apr 22 17:34:07.602919 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602916 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/31e11e0d-c49c-4634-b336-26f608c0be83-ca-trust-extracted\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " 
pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.603285 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602944 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/31e11e0d-c49c-4634-b336-26f608c0be83-registry-certificates\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.603285 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602967 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2dd100f6-0060-426d-9cf7-a7f9fafa003a-tmp\") pod \"insights-operator-585dfdc468-7rk27\" (UID: \"2dd100f6-0060-426d-9cf7-a7f9fafa003a\") " pod="openshift-insights/insights-operator-585dfdc468-7rk27" Apr 22 17:34:07.603285 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.602996 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcp9q\" (UniqueName: \"kubernetes.io/projected/daeace97-112e-453b-ae2c-bd7b73b63cc1-kube-api-access-lcp9q\") pod \"network-check-source-8894fc9bd-2qtfd\" (UID: \"daeace97-112e-453b-ae2c-bd7b73b63cc1\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2qtfd" Apr 22 17:34:07.603285 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.603027 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dd100f6-0060-426d-9cf7-a7f9fafa003a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-7rk27\" (UID: \"2dd100f6-0060-426d-9cf7-a7f9fafa003a\") " pod="openshift-insights/insights-operator-585dfdc468-7rk27" Apr 22 17:34:07.603285 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.603055 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-bound-sa-token\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.603285 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.603085 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2dd100f6-0060-426d-9cf7-a7f9fafa003a-snapshots\") pod \"insights-operator-585dfdc468-7rk27\" (UID: \"2dd100f6-0060-426d-9cf7-a7f9fafa003a\") " pod="openshift-insights/insights-operator-585dfdc468-7rk27" Apr 22 17:34:07.603285 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.603158 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dd100f6-0060-426d-9cf7-a7f9fafa003a-serving-cert\") pod \"insights-operator-585dfdc468-7rk27\" (UID: \"2dd100f6-0060-426d-9cf7-a7f9fafa003a\") " pod="openshift-insights/insights-operator-585dfdc468-7rk27" Apr 22 17:34:07.607523 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.607499 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8cqtb"] Apr 22 17:34:07.607642 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.607633 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6bzf9" Apr 22 17:34:07.612016 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.611996 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 17:34:07.612145 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.612040 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 17:34:07.612145 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.612115 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 17:34:07.612288 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.612267 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:34:07.612338 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.612317 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-mmw4b\"" Apr 22 17:34:07.621938 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.621902 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stksb\" (UniqueName: \"kubernetes.io/projected/8b7e852e-32bb-4bbe-be45-374b4376ee6d-kube-api-access-stksb\") pod \"volume-data-source-validator-7c6cbb6c87-v8lmw\" (UID: \"8b7e852e-32bb-4bbe-be45-374b4376ee6d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v8lmw" Apr 22 17:34:07.622893 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.622874 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v8lmw"] Apr 22 17:34:07.622987 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.622900 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-vvmj2"] Apr 22 17:34:07.622987 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.622914 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-s28sr"] Apr 22 17:34:07.623082 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.623000 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8cqtb" Apr 22 17:34:07.625105 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.625050 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 17:34:07.625105 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.625074 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-424zf\"" Apr 22 17:34:07.625252 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.625112 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:34:07.625252 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.625176 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 17:34:07.625406 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.625393 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 17:34:07.644922 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.644889 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5944f7956-vsk55"] Apr 22 17:34:07.644922 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.644927 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw"] Apr 22 17:34:07.645114 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.644942 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69c75ddc67-c9dl5"] Apr 22 17:34:07.645114 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.645053 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s28sr" Apr 22 17:34:07.645196 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.645054 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8cqtb"] Apr 22 17:34:07.645196 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.645157 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-pcpz9"] Apr 22 17:34:07.645196 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.645177 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-9d9fb4b58-dpchv"] Apr 22 17:34:07.645196 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.645189 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-7rk27"] Apr 22 17:34:07.645372 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.645204 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6bzf9"] Apr 22 17:34:07.645372 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.645217 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp6f7"] Apr 22 17:34:07.645372 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.645230 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-2qtfd"] Apr 22 17:34:07.645372 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.645241 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-s28sr"] Apr 22 17:34:07.645372 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.645267 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vj4xl"] Apr 22 17:34:07.648036 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.647918 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 17:34:07.648036 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.647946 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-k4t5h\"" Apr 22 17:34:07.648036 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.647966 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 17:34:07.648238 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.648172 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 17:34:07.657654 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.657632 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vj4xl"] Apr 22 17:34:07.657779 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.657762 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:07.660037 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.660018 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 17:34:07.660148 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.660044 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ghfnt\"" Apr 22 17:34:07.660148 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.660065 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 17:34:07.704374 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704331 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfe6b31-c032-43ba-be07-caa12af15041-serving-cert\") pod \"console-operator-9d4b6777b-vvmj2\" (UID: \"7bfe6b31-c032-43ba-be07-caa12af15041\") " pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" Apr 22 17:34:07.704579 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704380 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xshvq\" (UniqueName: \"kubernetes.io/projected/17af67ec-8577-45de-abbb-01a7199ee7cd-kube-api-access-xshvq\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:07.704579 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704414 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7526861b-0afa-4db9-9077-282b6ab524f3-config\") pod \"service-ca-operator-d6fc45fc5-8cqtb\" (UID: \"7526861b-0afa-4db9-9077-282b6ab524f3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8cqtb" Apr 22 17:34:07.704579 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704473 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/84e06372-35da-4f2d-84a9-4f0537970fba-image-registry-private-configuration\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.704579 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704511 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/346331ec-1cea-46ea-8952-6af403c257c0-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6bzf9\" (UID: \"346331ec-1cea-46ea-8952-6af403c257c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6bzf9" Apr 22 17:34:07.704579 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704541 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m9r9\" (UniqueName: \"kubernetes.io/projected/2dd100f6-0060-426d-9cf7-a7f9fafa003a-kube-api-access-5m9r9\") pod \"insights-operator-585dfdc468-7rk27\" (UID: \"2dd100f6-0060-426d-9cf7-a7f9fafa003a\") " pod="openshift-insights/insights-operator-585dfdc468-7rk27" Apr 22 17:34:07.704579 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704573 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"default-certificate\" (UniqueName: \"kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-default-certificate\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:07.704863 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704600 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7526861b-0afa-4db9-9077-282b6ab524f3-serving-cert\") pod \"service-ca-operator-d6fc45fc5-8cqtb\" (UID: \"7526861b-0afa-4db9-9077-282b6ab524f3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8cqtb" Apr 22 17:34:07.704863 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704633 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zwxp\" (UniqueName: \"kubernetes.io/projected/346331ec-1cea-46ea-8952-6af403c257c0-kube-api-access-9zwxp\") pod \"kube-storage-version-migrator-operator-6769c5d45-6bzf9\" (UID: \"346331ec-1cea-46ea-8952-6af403c257c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6bzf9" Apr 22 17:34:07.704863 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704661 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdzml\" (UniqueName: \"kubernetes.io/projected/e9ca84a4-7331-4976-9476-b7842a9814e3-kube-api-access-wdzml\") pod \"cluster-monitoring-operator-75587bd455-vpqhw\" (UID: \"e9ca84a4-7331-4976-9476-b7842a9814e3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw" Apr 22 17:34:07.704863 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704689 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84e06372-35da-4f2d-84a9-4f0537970fba-trusted-ca\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.704863 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704716 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/31e11e0d-c49c-4634-b336-26f608c0be83-trusted-ca\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.704863 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704744 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:07.704863 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704773 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/84e06372-35da-4f2d-84a9-4f0537970fba-ca-trust-extracted\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.704863 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704801 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/346331ec-1cea-46ea-8952-6af403c257c0-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6bzf9\" (UID: \"346331ec-1cea-46ea-8952-6af403c257c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6bzf9" Apr 22 17:34:07.704863 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704831 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bfe6b31-c032-43ba-be07-caa12af15041-trusted-ca\") pod \"console-operator-9d4b6777b-vvmj2\" (UID: \"7bfe6b31-c032-43ba-be07-caa12af15041\") " pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" Apr 22 17:34:07.705292 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704877 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-bound-sa-token\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.705292 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704904 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:07.705292 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704935 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-stats-auth\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:07.705292 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704954 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrzmm\" (UniqueName: \"kubernetes.io/projected/7bfe6b31-c032-43ba-be07-caa12af15041-kube-api-access-mrzmm\") pod \"console-operator-9d4b6777b-vvmj2\" (UID: \"7bfe6b31-c032-43ba-be07-caa12af15041\") " pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" Apr 22 17:34:07.705292 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.704974 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/84e06372-35da-4f2d-84a9-4f0537970fba-registry-certificates\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.705292 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705000 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bfe6b31-c032-43ba-be07-caa12af15041-config\") pod \"console-operator-9d4b6777b-vvmj2\" (UID: \"7bfe6b31-c032-43ba-be07-caa12af15041\") " pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" Apr 22 17:34:07.705292 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705041 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.705292 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705068 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/31e11e0d-c49c-4634-b336-26f608c0be83-installation-pull-secrets\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.705292 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705096 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/001a3e34-ff95-45f5-a62e-b8389b0e0df0-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-cp6f7\" (UID: \"001a3e34-ff95-45f5-a62e-b8389b0e0df0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp6f7" Apr 22 17:34:07.705292 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705134 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npwlr\" (UniqueName: \"kubernetes.io/projected/7526861b-0afa-4db9-9077-282b6ab524f3-kube-api-access-npwlr\") pod \"service-ca-operator-d6fc45fc5-8cqtb\" (UID: \"7526861b-0afa-4db9-9077-282b6ab524f3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8cqtb" Apr 22 17:34:07.705292 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705170 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/84e06372-35da-4f2d-84a9-4f0537970fba-installation-pull-secrets\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.705292 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705198 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.705292 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705217 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtclj\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-kube-api-access-jtclj\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.705292 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705240 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vpqhw\" (UID: \"e9ca84a4-7331-4976-9476-b7842a9814e3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw" Apr 22 17:34:07.705292 ip-10-0-131-22 kubenswrapper[2572]: I0422 
17:34:07.705269 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xr9rz\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-kube-api-access-xr9rz\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.706130 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705287 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/e9ca84a4-7331-4976-9476-b7842a9814e3-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vpqhw\" (UID: \"e9ca84a4-7331-4976-9476-b7842a9814e3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw" Apr 22 17:34:07.706130 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705308 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/31e11e0d-c49c-4634-b336-26f608c0be83-image-registry-private-configuration\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.706130 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705325 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dd100f6-0060-426d-9cf7-a7f9fafa003a-service-ca-bundle\") pod \"insights-operator-585dfdc468-7rk27\" (UID: \"2dd100f6-0060-426d-9cf7-a7f9fafa003a\") " pod="openshift-insights/insights-operator-585dfdc468-7rk27" Apr 22 17:34:07.706130 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705345 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/31e11e0d-c49c-4634-b336-26f608c0be83-ca-trust-extracted\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.706130 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705361 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/31e11e0d-c49c-4634-b336-26f608c0be83-registry-certificates\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.706130 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705376 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2dd100f6-0060-426d-9cf7-a7f9fafa003a-tmp\") pod \"insights-operator-585dfdc468-7rk27\" (UID: \"2dd100f6-0060-426d-9cf7-a7f9fafa003a\") " pod="openshift-insights/insights-operator-585dfdc468-7rk27" Apr 22 17:34:07.706130 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705399 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsg5q\" (UniqueName: \"kubernetes.io/projected/001a3e34-ff95-45f5-a62e-b8389b0e0df0-kube-api-access-qsg5q\") pod \"cluster-samples-operator-6dc5bdb6b4-cp6f7\" (UID: \"001a3e34-ff95-45f5-a62e-b8389b0e0df0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp6f7" Apr 22 17:34:07.706130 
ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705465 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcp9q\" (UniqueName: \"kubernetes.io/projected/daeace97-112e-453b-ae2c-bd7b73b63cc1-kube-api-access-lcp9q\") pod \"network-check-source-8894fc9bd-2qtfd\" (UID: \"daeace97-112e-453b-ae2c-bd7b73b63cc1\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2qtfd" Apr 22 17:34:07.706130 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705498 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dd100f6-0060-426d-9cf7-a7f9fafa003a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-7rk27\" (UID: \"2dd100f6-0060-426d-9cf7-a7f9fafa003a\") " pod="openshift-insights/insights-operator-585dfdc468-7rk27" Apr 22 17:34:07.706130 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.705536 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle podName:17af67ec-8577-45de-abbb-01a7199ee7cd nodeName:}" failed. No retries permitted until 2026-04-22 17:34:08.205515036 +0000 UTC m=+34.672776739 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle") pod "router-default-9d9fb4b58-dpchv" (UID: "17af67ec-8577-45de-abbb-01a7199ee7cd") : configmap references non-existent config key: service-ca.crt Apr 22 17:34:07.706130 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705566 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-bound-sa-token\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.706130 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705603 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-pcpz9\" (UID: \"59ff7442-70a1-4df1-a3a2-9eff7d027d6e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pcpz9" Apr 22 17:34:07.706130 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705639 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2dd100f6-0060-426d-9cf7-a7f9fafa003a-snapshots\") pod \"insights-operator-585dfdc468-7rk27\" (UID: \"2dd100f6-0060-426d-9cf7-a7f9fafa003a\") " pod="openshift-insights/insights-operator-585dfdc468-7rk27" Apr 22 17:34:07.706130 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705671 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dd100f6-0060-426d-9cf7-a7f9fafa003a-serving-cert\") pod \"insights-operator-585dfdc468-7rk27\" (UID: \"2dd100f6-0060-426d-9cf7-a7f9fafa003a\") " pod="openshift-insights/insights-operator-585dfdc468-7rk27" Apr 22 17:34:07.706130 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705717 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pcpz9\" (UID: \"59ff7442-70a1-4df1-a3a2-9eff7d027d6e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pcpz9" Apr 22 17:34:07.706909 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.705846 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bfe6b31-c032-43ba-be07-caa12af15041-trusted-ca\") pod \"console-operator-9d4b6777b-vvmj2\" (UID: \"7bfe6b31-c032-43ba-be07-caa12af15041\") " pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" Apr 22 17:34:07.706909 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.706033 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bfe6b31-c032-43ba-be07-caa12af15041-config\") pod \"console-operator-9d4b6777b-vvmj2\" (UID: \"7bfe6b31-c032-43ba-be07-caa12af15041\") " pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" Apr 22 17:34:07.706909 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.706214 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:34:07.706909 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.706269 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs podName:17af67ec-8577-45de-abbb-01a7199ee7cd nodeName:}" failed. No retries permitted until 2026-04-22 17:34:08.206249803 +0000 UTC m=+34.673511504 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs") pod "router-default-9d9fb4b58-dpchv" (UID: "17af67ec-8577-45de-abbb-01a7199ee7cd") : secret "router-metrics-certs-default" not found Apr 22 17:34:07.706909 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.706375 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:34:07.706909 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.706413 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5944f7956-vsk55: secret "image-registry-tls" not found Apr 22 17:34:07.706909 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.706495 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls podName:84e06372-35da-4f2d-84a9-4f0537970fba nodeName:}" failed. No retries permitted until 2026-04-22 17:34:08.206477259 +0000 UTC m=+34.673738939 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls") pod "image-registry-5944f7956-vsk55" (UID: "84e06372-35da-4f2d-84a9-4f0537970fba") : secret "image-registry-tls" not found Apr 22 17:34:07.706909 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.706663 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dd100f6-0060-426d-9cf7-a7f9fafa003a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-7rk27\" (UID: \"2dd100f6-0060-426d-9cf7-a7f9fafa003a\") " pod="openshift-insights/insights-operator-585dfdc468-7rk27" Apr 22 17:34:07.708292 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.706934 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/31e11e0d-c49c-4634-b336-26f608c0be83-trusted-ca\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.708292 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.707165 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/84e06372-35da-4f2d-84a9-4f0537970fba-registry-certificates\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.708292 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.707825 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/31e11e0d-c49c-4634-b336-26f608c0be83-registry-certificates\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.708292 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.708039 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84e06372-35da-4f2d-84a9-4f0537970fba-trusted-ca\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.708292 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.708037 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:34:07.708292 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.708070 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69c75ddc67-c9dl5: secret "image-registry-tls" not found Apr 22 17:34:07.710005 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.709976 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dd100f6-0060-426d-9cf7-a7f9fafa003a-service-ca-bundle\") pod \"insights-operator-585dfdc468-7rk27\" (UID: \"2dd100f6-0060-426d-9cf7-a7f9fafa003a\") " pod="openshift-insights/insights-operator-585dfdc468-7rk27" Apr 22 17:34:07.710291 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.710264 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/31e11e0d-c49c-4634-b336-26f608c0be83-ca-trust-extracted\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.710291 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.710266 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2dd100f6-0060-426d-9cf7-a7f9fafa003a-snapshots\") pod \"insights-operator-585dfdc468-7rk27\" (UID: \"2dd100f6-0060-426d-9cf7-a7f9fafa003a\") " pod="openshift-insights/insights-operator-585dfdc468-7rk27" Apr 22 17:34:07.710576 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.710552 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/31e11e0d-c49c-4634-b336-26f608c0be83-installation-pull-secrets\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.710736 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.710708 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/84e06372-35da-4f2d-84a9-4f0537970fba-ca-trust-extracted\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.710866 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.710812 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dd100f6-0060-426d-9cf7-a7f9fafa003a-serving-cert\") pod \"insights-operator-585dfdc468-7rk27\" (UID: \"2dd100f6-0060-426d-9cf7-a7f9fafa003a\") " pod="openshift-insights/insights-operator-585dfdc468-7rk27" Apr 22 17:34:07.713522 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.711005 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2dd100f6-0060-426d-9cf7-a7f9fafa003a-tmp\") pod \"insights-operator-585dfdc468-7rk27\" (UID: \"2dd100f6-0060-426d-9cf7-a7f9fafa003a\") " pod="openshift-insights/insights-operator-585dfdc468-7rk27" Apr 22 17:34:07.713522 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.711055 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-default-certificate\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:07.713522 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.711220 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-stats-auth\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:07.713522 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.711329 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfe6b31-c032-43ba-be07-caa12af15041-serving-cert\") pod \"console-operator-9d4b6777b-vvmj2\" (UID: \"7bfe6b31-c032-43ba-be07-caa12af15041\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" Apr 22 17:34:07.713522 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.711361 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls podName:31e11e0d-c49c-4634-b336-26f608c0be83 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:08.208107088 +0000 UTC m=+34.675368791 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls") pod "image-registry-69c75ddc67-c9dl5" (UID: "31e11e0d-c49c-4634-b336-26f608c0be83") : secret "image-registry-tls" not found Apr 22 17:34:07.713522 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.713436 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/84e06372-35da-4f2d-84a9-4f0537970fba-installation-pull-secrets\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.713875 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.711410 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/84e06372-35da-4f2d-84a9-4f0537970fba-image-registry-private-configuration\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.714105 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.714018 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/31e11e0d-c49c-4634-b336-26f608c0be83-image-registry-private-configuration\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.717478 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.717375 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m9r9\" (UniqueName: \"kubernetes.io/projected/2dd100f6-0060-426d-9cf7-a7f9fafa003a-kube-api-access-5m9r9\") pod \"insights-operator-585dfdc468-7rk27\" (UID: \"2dd100f6-0060-426d-9cf7-a7f9fafa003a\") " pod="openshift-insights/insights-operator-585dfdc468-7rk27" Apr 22 17:34:07.718004 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.717715 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrzmm\" (UniqueName: \"kubernetes.io/projected/7bfe6b31-c032-43ba-be07-caa12af15041-kube-api-access-mrzmm\") pod \"console-operator-9d4b6777b-vvmj2\" (UID: \"7bfe6b31-c032-43ba-be07-caa12af15041\") " pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" Apr 22 17:34:07.718004 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.717937 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xshvq\" (UniqueName: \"kubernetes.io/projected/17af67ec-8577-45de-abbb-01a7199ee7cd-kube-api-access-xshvq\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:07.718004 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.717958 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-bound-sa-token\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.719562 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.719538 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtclj\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-kube-api-access-jtclj\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.721008 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.720966 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-bound-sa-token\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:07.721008 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.721000 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcp9q\" (UniqueName: \"kubernetes.io/projected/daeace97-112e-453b-ae2c-bd7b73b63cc1-kube-api-access-lcp9q\") pod \"network-check-source-8894fc9bd-2qtfd\" (UID: \"daeace97-112e-453b-ae2c-bd7b73b63cc1\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2qtfd" Apr 22 17:34:07.725396 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.725371 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr9rz\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-kube-api-access-xr9rz\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:07.751521 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.751479 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v8lmw" Apr 22 17:34:07.807117 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.807077 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-pcpz9\" (UID: \"59ff7442-70a1-4df1-a3a2-9eff7d027d6e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pcpz9" Apr 22 17:34:07.807319 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.807129 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-config-volume\") pod \"dns-default-vj4xl\" (UID: \"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a\") " pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:07.807319 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.807181 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pcpz9\" (UID: \"59ff7442-70a1-4df1-a3a2-9eff7d027d6e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pcpz9" Apr 22 17:34:07.807319 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.807209 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert\") pod \"ingress-canary-s28sr\" (UID: \"dabb0188-2e22-40cb-b765-c6e0a5a0b030\") " pod="openshift-ingress-canary/ingress-canary-s28sr" Apr 22 17:34:07.807319 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.807255 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7526861b-0afa-4db9-9077-282b6ab524f3-config\") pod \"service-ca-operator-d6fc45fc5-8cqtb\" (UID: \"7526861b-0afa-4db9-9077-282b6ab524f3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8cqtb" Apr 22 17:34:07.807554 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.807322 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:34:07.807554 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.807405 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/346331ec-1cea-46ea-8952-6af403c257c0-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6bzf9\" (UID: \"346331ec-1cea-46ea-8952-6af403c257c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6bzf9" Apr 22 17:34:07.807554 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.807434 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert podName:59ff7442-70a1-4df1-a3a2-9eff7d027d6e nodeName:}" failed. No retries permitted until 2026-04-22 17:34:08.307394378 +0000 UTC m=+34.774656075 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-pcpz9" (UID: "59ff7442-70a1-4df1-a3a2-9eff7d027d6e") : secret "networking-console-plugin-cert" not found Apr 22 17:34:07.807554 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.807482 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7526861b-0afa-4db9-9077-282b6ab524f3-serving-cert\") pod \"service-ca-operator-d6fc45fc5-8cqtb\" (UID: \"7526861b-0afa-4db9-9077-282b6ab524f3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8cqtb" Apr 22 17:34:07.807554 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.807517 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zwxp\" (UniqueName: \"kubernetes.io/projected/346331ec-1cea-46ea-8952-6af403c257c0-kube-api-access-9zwxp\") pod \"kube-storage-version-migrator-operator-6769c5d45-6bzf9\" (UID: \"346331ec-1cea-46ea-8952-6af403c257c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6bzf9" Apr 22 17:34:07.807554 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.807549 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdzml\" (UniqueName: \"kubernetes.io/projected/e9ca84a4-7331-4976-9476-b7842a9814e3-kube-api-access-wdzml\") pod \"cluster-monitoring-operator-75587bd455-vpqhw\" (UID: \"e9ca84a4-7331-4976-9476-b7842a9814e3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw" Apr 22 17:34:07.807809 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.807585 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-tmp-dir\") pod \"dns-default-vj4xl\" (UID: \"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a\") " pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:07.807809 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.807666 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9p6l\" (UniqueName: \"kubernetes.io/projected/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-kube-api-access-n9p6l\") pod \"dns-default-vj4xl\" (UID: \"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a\") " pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:07.807809 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.807697 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/346331ec-1cea-46ea-8952-6af403c257c0-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6bzf9\" (UID: \"346331ec-1cea-46ea-8952-6af403c257c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6bzf9" Apr 22 17:34:07.807809 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.807768 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/001a3e34-ff95-45f5-a62e-b8389b0e0df0-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-cp6f7\" (UID: \"001a3e34-ff95-45f5-a62e-b8389b0e0df0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp6f7" Apr 22 17:34:07.807809 ip-10-0-131-22 
kubenswrapper[2572]: I0422 17:34:07.807790 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npwlr\" (UniqueName: \"kubernetes.io/projected/7526861b-0afa-4db9-9077-282b6ab524f3-kube-api-access-npwlr\") pod \"service-ca-operator-d6fc45fc5-8cqtb\" (UID: \"7526861b-0afa-4db9-9077-282b6ab524f3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8cqtb" Apr 22 17:34:07.807809 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.807807 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls\") pod \"dns-default-vj4xl\" (UID: \"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a\") " pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:07.808103 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.807845 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vpqhw\" (UID: \"e9ca84a4-7331-4976-9476-b7842a9814e3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw" Apr 22 17:34:07.808103 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.807872 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/e9ca84a4-7331-4976-9476-b7842a9814e3-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vpqhw\" (UID: \"e9ca84a4-7331-4976-9476-b7842a9814e3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw" Apr 22 17:34:07.808103 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.807885 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-pcpz9\" (UID: \"59ff7442-70a1-4df1-a3a2-9eff7d027d6e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pcpz9" Apr 22 17:34:07.808103 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.807903 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mplk\" (UniqueName: \"kubernetes.io/projected/dabb0188-2e22-40cb-b765-c6e0a5a0b030-kube-api-access-7mplk\") pod \"ingress-canary-s28sr\" (UID: \"dabb0188-2e22-40cb-b765-c6e0a5a0b030\") " pod="openshift-ingress-canary/ingress-canary-s28sr" Apr 22 17:34:07.808103 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.807912 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/346331ec-1cea-46ea-8952-6af403c257c0-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6bzf9\" (UID: \"346331ec-1cea-46ea-8952-6af403c257c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6bzf9" Apr 22 17:34:07.808103 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.807935 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsg5q\" (UniqueName: \"kubernetes.io/projected/001a3e34-ff95-45f5-a62e-b8389b0e0df0-kube-api-access-qsg5q\") pod \"cluster-samples-operator-6dc5bdb6b4-cp6f7\" (UID: \"001a3e34-ff95-45f5-a62e-b8389b0e0df0\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp6f7" Apr 22 17:34:07.808103 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.807988 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:34:07.808103 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.807996 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:34:07.808103 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.808036 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls podName:e9ca84a4-7331-4976-9476-b7842a9814e3 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:08.308024099 +0000 UTC m=+34.775285793 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vpqhw" (UID: "e9ca84a4-7331-4976-9476-b7842a9814e3") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:34:07.808103 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.808056 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/001a3e34-ff95-45f5-a62e-b8389b0e0df0-samples-operator-tls podName:001a3e34-ff95-45f5-a62e-b8389b0e0df0 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:08.30804154 +0000 UTC m=+34.775303221 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/001a3e34-ff95-45f5-a62e-b8389b0e0df0-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-cp6f7" (UID: "001a3e34-ff95-45f5-a62e-b8389b0e0df0") : secret "samples-operator-tls" not found Apr 22 17:34:07.808712 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.808690 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/e9ca84a4-7331-4976-9476-b7842a9814e3-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vpqhw\" (UID: \"e9ca84a4-7331-4976-9476-b7842a9814e3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw" Apr 22 17:34:07.810249 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.810229 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/346331ec-1cea-46ea-8952-6af403c257c0-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6bzf9\" (UID: \"346331ec-1cea-46ea-8952-6af403c257c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6bzf9" Apr 22 17:34:07.810351 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.810259 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7526861b-0afa-4db9-9077-282b6ab524f3-serving-cert\") pod \"service-ca-operator-d6fc45fc5-8cqtb\" (UID: \"7526861b-0afa-4db9-9077-282b6ab524f3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8cqtb" Apr 22 17:34:07.814499 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.814405 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" Apr 22 17:34:07.817075 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.817030 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsg5q\" (UniqueName: \"kubernetes.io/projected/001a3e34-ff95-45f5-a62e-b8389b0e0df0-kube-api-access-qsg5q\") pod \"cluster-samples-operator-6dc5bdb6b4-cp6f7\" (UID: \"001a3e34-ff95-45f5-a62e-b8389b0e0df0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp6f7" Apr 22 17:34:07.817294 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.817246 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zwxp\" (UniqueName: \"kubernetes.io/projected/346331ec-1cea-46ea-8952-6af403c257c0-kube-api-access-9zwxp\") pod \"kube-storage-version-migrator-operator-6769c5d45-6bzf9\" (UID: \"346331ec-1cea-46ea-8952-6af403c257c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6bzf9" Apr 22 17:34:07.817482 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.817461 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npwlr\" (UniqueName: \"kubernetes.io/projected/7526861b-0afa-4db9-9077-282b6ab524f3-kube-api-access-npwlr\") pod \"service-ca-operator-d6fc45fc5-8cqtb\" (UID: \"7526861b-0afa-4db9-9077-282b6ab524f3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8cqtb" Apr 22 17:34:07.817779 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.817750 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdzml\" (UniqueName: \"kubernetes.io/projected/e9ca84a4-7331-4976-9476-b7842a9814e3-kube-api-access-wdzml\") pod \"cluster-monitoring-operator-75587bd455-vpqhw\" (UID: \"e9ca84a4-7331-4976-9476-b7842a9814e3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw" Apr 22 17:34:07.817913 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.817892 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7526861b-0afa-4db9-9077-282b6ab524f3-config\") pod \"service-ca-operator-d6fc45fc5-8cqtb\" (UID: \"7526861b-0afa-4db9-9077-282b6ab524f3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8cqtb" Apr 22 17:34:07.835684 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.835640 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-7rk27" Apr 22 17:34:07.847569 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.847535 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2qtfd" Apr 22 17:34:07.909005 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.908973 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert\") pod \"ingress-canary-s28sr\" (UID: \"dabb0188-2e22-40cb-b765-c6e0a5a0b030\") " pod="openshift-ingress-canary/ingress-canary-s28sr" Apr 22 17:34:07.909168 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.909027 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-tmp-dir\") pod \"dns-default-vj4xl\" (UID: \"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a\") " pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:07.909168 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.909133 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9p6l\" (UniqueName: \"kubernetes.io/projected/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-kube-api-access-n9p6l\") pod \"dns-default-vj4xl\" (UID: \"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a\") " pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:07.909268 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.909158 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:34:07.909268 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.909189 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls\") pod \"dns-default-vj4xl\" (UID: \"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a\") " pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:07.909268 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.909235 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert podName:dabb0188-2e22-40cb-b765-c6e0a5a0b030 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:08.409215166 +0000 UTC m=+34.876476867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert") pod "ingress-canary-s28sr" (UID: "dabb0188-2e22-40cb-b765-c6e0a5a0b030") : secret "canary-serving-cert" not found Apr 22 17:34:07.909268 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.909255 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:34:07.909456 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.909297 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls podName:b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a nodeName:}" failed. No retries permitted until 2026-04-22 17:34:08.4092835 +0000 UTC m=+34.876545180 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls") pod "dns-default-vj4xl" (UID: "b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a") : secret "dns-default-metrics-tls" not found Apr 22 17:34:07.909456 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.909313 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs\") pod \"network-metrics-daemon-s8svp\" (UID: \"9feb1c60-1e90-405e-9beb-753e0747aed0\") " pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:34:07.909456 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.909356 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mplk\" (UniqueName: \"kubernetes.io/projected/dabb0188-2e22-40cb-b765-c6e0a5a0b030-kube-api-access-7mplk\") pod \"ingress-canary-s28sr\" (UID: \"dabb0188-2e22-40cb-b765-c6e0a5a0b030\") " pod="openshift-ingress-canary/ingress-canary-s28sr" Apr 22 17:34:07.909456 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.909397 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-config-volume\") pod \"dns-default-vj4xl\" (UID: \"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a\") " pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:07.909456 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.909443 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:07.909456 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.909449 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-tmp-dir\") pod \"dns-default-vj4xl\" (UID: \"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a\") " pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:07.909657 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:07.909510 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs podName:9feb1c60-1e90-405e-9beb-753e0747aed0 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:39.909482415 +0000 UTC m=+66.376744109 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs") pod "network-metrics-daemon-s8svp" (UID: "9feb1c60-1e90-405e-9beb-753e0747aed0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:07.909850 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.909832 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-config-volume\") pod \"dns-default-vj4xl\" (UID: \"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a\") " pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:07.919706 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.919678 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mplk\" (UniqueName: \"kubernetes.io/projected/dabb0188-2e22-40cb-b765-c6e0a5a0b030-kube-api-access-7mplk\") pod \"ingress-canary-s28sr\" (UID: \"dabb0188-2e22-40cb-b765-c6e0a5a0b030\") " pod="openshift-ingress-canary/ingress-canary-s28sr" Apr 22 17:34:07.920372 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.920342 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9p6l\" (UniqueName: \"kubernetes.io/projected/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-kube-api-access-n9p6l\") pod \"dns-default-vj4xl\" (UID: \"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a\") " pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:07.922371 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.922348 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6bzf9" Apr 22 17:34:07.932364 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:07.932054 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8cqtb" Apr 22 17:34:08.011442 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.010883 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6xj8\" (UniqueName: \"kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8\") pod \"network-check-target-mh726\" (UID: \"7c3eadbd-c6af-4686-bed4-c3a47b257864\") " pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:34:08.025239 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.025158 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6xj8\" (UniqueName: \"kubernetes.io/projected/7c3eadbd-c6af-4686-bed4-c3a47b257864-kube-api-access-m6xj8\") pod \"network-check-target-mh726\" (UID: \"7c3eadbd-c6af-4686-bed4-c3a47b257864\") " pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:34:08.130220 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.129643 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v8lmw"] Apr 22 17:34:08.133286 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.133259 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-2qtfd"] Apr 22 17:34:08.139544 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.139513 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-vvmj2"] Apr 22 17:34:08.153345 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.153321 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-7rk27"] Apr 22 17:34:08.155340 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:34:08.155307 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b7e852e_32bb_4bbe_be45_374b4376ee6d.slice/crio-9a2548105ff9a6756de2edc8108affa5b1c148081e286d316b940da2378151ab WatchSource:0}: Error finding container 9a2548105ff9a6756de2edc8108affa5b1c148081e286d316b940da2378151ab: Status 404 returned error can't find the container with id 9a2548105ff9a6756de2edc8108affa5b1c148081e286d316b940da2378151ab Apr 22 17:34:08.157115 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.157093 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6bzf9"] Apr 22 17:34:08.158534 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.158394 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8cqtb"] Apr 22 17:34:08.162799 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:34:08.162762 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod346331ec_1cea_46ea_8952_6af403c257c0.slice/crio-2d69139e3a15e9da6ab52be92f87a48948e40aedcdddbdadb9ece6886cf602ec WatchSource:0}: Error finding container 2d69139e3a15e9da6ab52be92f87a48948e40aedcdddbdadb9ece6886cf602ec: Status 404 returned error can't find the container with id 2d69139e3a15e9da6ab52be92f87a48948e40aedcdddbdadb9ece6886cf602ec Apr 22 17:34:08.163021 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:34:08.162952 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7526861b_0afa_4db9_9077_282b6ab524f3.slice/crio-8cceb4c47fe8d65a891703905fdb61586af842e1ff3b2732e31c2eb5825ee485 WatchSource:0}: Error finding container 8cceb4c47fe8d65a891703905fdb61586af842e1ff3b2732e31c2eb5825ee485: Status 404 returned error can't find the container with id 8cceb4c47fe8d65a891703905fdb61586af842e1ff3b2732e31c2eb5825ee485 Apr 22 17:34:08.212967 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.212945 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:08.213047 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.213000 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:08.213097 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:08.213076 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:34:08.213097 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:08.213088 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69c75ddc67-c9dl5: secret "image-registry-tls" not found Apr 22 17:34:08.213191 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:08.213097 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:34:08.213191 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:08.213116 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5944f7956-vsk55: secret "image-registry-tls" not found Apr 22 17:34:08.213191 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:08.213142 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls podName:31e11e0d-c49c-4634-b336-26f608c0be83 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:09.213125114 +0000 UTC m=+35.680386809 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls") pod "image-registry-69c75ddc67-c9dl5" (UID: "31e11e0d-c49c-4634-b336-26f608c0be83") : secret "image-registry-tls" not found Apr 22 17:34:08.213191 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:08.213160 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls podName:84e06372-35da-4f2d-84a9-4f0537970fba nodeName:}" failed. No retries permitted until 2026-04-22 17:34:09.213153215 +0000 UTC m=+35.680414905 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls") pod "image-registry-5944f7956-vsk55" (UID: "84e06372-35da-4f2d-84a9-4f0537970fba") : secret "image-registry-tls" not found Apr 22 17:34:08.213364 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.213253 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:08.213364 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.213277 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:08.213364 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:08.213346 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:34:08.213514 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:08.213355 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle podName:17af67ec-8577-45de-abbb-01a7199ee7cd nodeName:}" failed. No retries permitted until 2026-04-22 17:34:09.213347311 +0000 UTC m=+35.680608991 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle") pod "router-default-9d9fb4b58-dpchv" (UID: "17af67ec-8577-45de-abbb-01a7199ee7cd") : configmap references non-existent config key: service-ca.crt Apr 22 17:34:08.213514 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:08.213400 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs podName:17af67ec-8577-45de-abbb-01a7199ee7cd nodeName:}" failed. No retries permitted until 2026-04-22 17:34:09.213386815 +0000 UTC m=+35.680648496 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs") pod "router-default-9d9fb4b58-dpchv" (UID: "17af67ec-8577-45de-abbb-01a7199ee7cd") : secret "router-metrics-certs-default" not found Apr 22 17:34:08.282014 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.281979 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" event={"ID":"7bfe6b31-c032-43ba-be07-caa12af15041","Type":"ContainerStarted","Data":"cb3b10b5e14bfcc30fea413ea1e61e7a2b3c7611337c4550175ab789106b16d7"} Apr 22 17:34:08.283057 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.283025 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8cqtb" event={"ID":"7526861b-0afa-4db9-9077-282b6ab524f3","Type":"ContainerStarted","Data":"8cceb4c47fe8d65a891703905fdb61586af842e1ff3b2732e31c2eb5825ee485"} Apr 22 17:34:08.284195 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.284156 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6bzf9" event={"ID":"346331ec-1cea-46ea-8952-6af403c257c0","Type":"ContainerStarted","Data":"2d69139e3a15e9da6ab52be92f87a48948e40aedcdddbdadb9ece6886cf602ec"} Apr 22 17:34:08.285337 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.285295 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-7rk27" event={"ID":"2dd100f6-0060-426d-9cf7-a7f9fafa003a","Type":"ContainerStarted","Data":"3041993e3c034790bbf83fb84833de8f890012565e5c6d3fbb9f3233a10b2b43"} Apr 22 17:34:08.286450 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.286409 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v8lmw" event={"ID":"8b7e852e-32bb-4bbe-be45-374b4376ee6d","Type":"ContainerStarted","Data":"9a2548105ff9a6756de2edc8108affa5b1c148081e286d316b940da2378151ab"} Apr 22 17:34:08.287387 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.287366 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2qtfd" event={"ID":"daeace97-112e-453b-ae2c-bd7b73b63cc1","Type":"ContainerStarted","Data":"4f4979fc716a026ab97bf8739f1fafafd799f227b850dcc4c64fc901df5e4128"} Apr 22 17:34:08.314726 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.314534 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/001a3e34-ff95-45f5-a62e-b8389b0e0df0-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-cp6f7\" (UID: \"001a3e34-ff95-45f5-a62e-b8389b0e0df0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp6f7" Apr 22 17:34:08.314828 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.314772 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vpqhw\" (UID: \"e9ca84a4-7331-4976-9476-b7842a9814e3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw" Apr 22 17:34:08.314828 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:08.314684 2572 secret.go:189] Couldn't get secret 
openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:34:08.314906 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.314842 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pcpz9\" (UID: \"59ff7442-70a1-4df1-a3a2-9eff7d027d6e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pcpz9" Apr 22 17:34:08.314906 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:08.314857 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/001a3e34-ff95-45f5-a62e-b8389b0e0df0-samples-operator-tls podName:001a3e34-ff95-45f5-a62e-b8389b0e0df0 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:09.314837493 +0000 UTC m=+35.782099196 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/001a3e34-ff95-45f5-a62e-b8389b0e0df0-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-cp6f7" (UID: "001a3e34-ff95-45f5-a62e-b8389b0e0df0") : secret "samples-operator-tls" not found Apr 22 17:34:08.315014 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:08.314911 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:34:08.315014 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:08.314935 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:34:08.315014 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:08.314955 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert podName:59ff7442-70a1-4df1-a3a2-9eff7d027d6e nodeName:}" failed. No retries permitted until 2026-04-22 17:34:09.314944266 +0000 UTC m=+35.782205948 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-pcpz9" (UID: "59ff7442-70a1-4df1-a3a2-9eff7d027d6e") : secret "networking-console-plugin-cert" not found Apr 22 17:34:08.315014 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:08.314982 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls podName:e9ca84a4-7331-4976-9476-b7842a9814e3 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:09.31496964 +0000 UTC m=+35.782231321 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vpqhw" (UID: "e9ca84a4-7331-4976-9476-b7842a9814e3") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:34:08.415925 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.415848 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls\") pod \"dns-default-vj4xl\" (UID: \"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a\") " pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:08.416059 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:08.415947 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert\") pod \"ingress-canary-s28sr\" (UID: \"dabb0188-2e22-40cb-b765-c6e0a5a0b030\") " pod="openshift-ingress-canary/ingress-canary-s28sr" Apr 22 17:34:08.416059 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:08.415993 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:34:08.416059 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:08.416036 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:34:08.416157 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:08.416061 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls podName:b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a nodeName:}" failed. No retries permitted until 2026-04-22 17:34:09.416042688 +0000 UTC m=+35.883304372 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls") pod "dns-default-vj4xl" (UID: "b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a") : secret "dns-default-metrics-tls" not found Apr 22 17:34:08.416157 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:08.416076 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert podName:dabb0188-2e22-40cb-b765-c6e0a5a0b030 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:09.416069537 +0000 UTC m=+35.883331217 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert") pod "ingress-canary-s28sr" (UID: "dabb0188-2e22-40cb-b765-c6e0a5a0b030") : secret "canary-serving-cert" not found Apr 22 17:34:09.125066 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:09.125033 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:34:09.125593 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:09.125463 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:34:09.131572 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:09.131121 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-2ldpn\"" Apr 22 17:34:09.131572 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:09.131380 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 17:34:09.133581 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:09.133561 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jqt69\"" Apr 22 17:34:09.141755 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:09.141362 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:34:09.229604 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:09.229573 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:09.229716 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:09.229627 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:09.229716 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:09.229679 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:09.229839 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:09.229743 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:09.230694 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:09.230487 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:34:09.230694 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:09.230510 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5944f7956-vsk55: secret "image-registry-tls" not found Apr 22 17:34:09.230694 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:09.230517 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle podName:17af67ec-8577-45de-abbb-01a7199ee7cd nodeName:}" failed. No retries permitted until 2026-04-22 17:34:11.230493678 +0000 UTC m=+37.697755360 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle") pod "router-default-9d9fb4b58-dpchv" (UID: "17af67ec-8577-45de-abbb-01a7199ee7cd") : configmap references non-existent config key: service-ca.crt Apr 22 17:34:09.230694 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:09.230562 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls podName:84e06372-35da-4f2d-84a9-4f0537970fba nodeName:}" failed. No retries permitted until 2026-04-22 17:34:11.230544879 +0000 UTC m=+37.697806564 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls") pod "image-registry-5944f7956-vsk55" (UID: "84e06372-35da-4f2d-84a9-4f0537970fba") : secret "image-registry-tls" not found Apr 22 17:34:09.230694 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:09.230624 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:34:09.230694 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:09.230635 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69c75ddc67-c9dl5: secret "image-registry-tls" not found Apr 22 17:34:09.230694 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:09.230670 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls podName:31e11e0d-c49c-4634-b336-26f608c0be83 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:11.230658281 +0000 UTC m=+37.697919964 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls") pod "image-registry-69c75ddc67-c9dl5" (UID: "31e11e0d-c49c-4634-b336-26f608c0be83") : secret "image-registry-tls" not found Apr 22 17:34:09.231168 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:09.230774 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:34:09.231168 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:09.230822 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs podName:17af67ec-8577-45de-abbb-01a7199ee7cd nodeName:}" failed. No retries permitted until 2026-04-22 17:34:11.230807069 +0000 UTC m=+37.698068768 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs") pod "router-default-9d9fb4b58-dpchv" (UID: "17af67ec-8577-45de-abbb-01a7199ee7cd") : secret "router-metrics-certs-default" not found Apr 22 17:34:09.298146 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:09.297402 2572 generic.go:358] "Generic (PLEG): container finished" podID="000311a6-600b-4136-89c9-336cdc563106" containerID="092c09e53f71f65134d0abe6a4680d9b11981963e6ea2a2b6e8ea0ae0a059523" exitCode=0 Apr 22 17:34:09.298146 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:09.297516 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rdl42" event={"ID":"000311a6-600b-4136-89c9-336cdc563106","Type":"ContainerDied","Data":"092c09e53f71f65134d0abe6a4680d9b11981963e6ea2a2b6e8ea0ae0a059523"} Apr 22 17:34:09.314975 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:09.314942 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mh726"] Apr 22 17:34:09.319317 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:34:09.319265 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c3eadbd_c6af_4686_bed4_c3a47b257864.slice/crio-7a187ec42785e8c21dcd35f79b4adb3f1031c1901b8e1a0f5fa287a827e72f23 WatchSource:0}: Error finding container 7a187ec42785e8c21dcd35f79b4adb3f1031c1901b8e1a0f5fa287a827e72f23: Status 404 returned error can't find the container with id 7a187ec42785e8c21dcd35f79b4adb3f1031c1901b8e1a0f5fa287a827e72f23 Apr 22 17:34:09.331666 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:09.330584 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/001a3e34-ff95-45f5-a62e-b8389b0e0df0-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-cp6f7\" (UID: \"001a3e34-ff95-45f5-a62e-b8389b0e0df0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp6f7" Apr 22 17:34:09.331666 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:09.330664 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vpqhw\" (UID: \"e9ca84a4-7331-4976-9476-b7842a9814e3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw" Apr 22 17:34:09.331666 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:09.330730 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pcpz9\" (UID: \"59ff7442-70a1-4df1-a3a2-9eff7d027d6e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pcpz9" Apr 22 17:34:09.331666 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:09.330885 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:34:09.331666 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:09.330941 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert 
podName:59ff7442-70a1-4df1-a3a2-9eff7d027d6e nodeName:}" failed. No retries permitted until 2026-04-22 17:34:11.330922479 +0000 UTC m=+37.798184163 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-pcpz9" (UID: "59ff7442-70a1-4df1-a3a2-9eff7d027d6e") : secret "networking-console-plugin-cert" not found Apr 22 17:34:09.331666 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:09.331001 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:34:09.331666 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:09.331034 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/001a3e34-ff95-45f5-a62e-b8389b0e0df0-samples-operator-tls podName:001a3e34-ff95-45f5-a62e-b8389b0e0df0 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:11.331022721 +0000 UTC m=+37.798284416 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/001a3e34-ff95-45f5-a62e-b8389b0e0df0-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-cp6f7" (UID: "001a3e34-ff95-45f5-a62e-b8389b0e0df0") : secret "samples-operator-tls" not found Apr 22 17:34:09.331666 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:09.331097 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:34:09.331666 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:09.331137 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls podName:e9ca84a4-7331-4976-9476-b7842a9814e3 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:11.331119097 +0000 UTC m=+37.798380782 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vpqhw" (UID: "e9ca84a4-7331-4976-9476-b7842a9814e3") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:34:09.432045 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:09.432010 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert\") pod \"ingress-canary-s28sr\" (UID: \"dabb0188-2e22-40cb-b765-c6e0a5a0b030\") " pod="openshift-ingress-canary/ingress-canary-s28sr" Apr 22 17:34:09.432320 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:09.432300 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls\") pod \"dns-default-vj4xl\" (UID: \"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a\") " pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:09.432557 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:09.432539 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:34:09.432646 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:09.432608 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls podName:b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a nodeName:}" failed. No retries permitted until 2026-04-22 17:34:11.432588232 +0000 UTC m=+37.899849927 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls") pod "dns-default-vj4xl" (UID: "b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a") : secret "dns-default-metrics-tls" not found Apr 22 17:34:09.433045 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:09.433027 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:34:09.433125 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:09.433080 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert podName:dabb0188-2e22-40cb-b765-c6e0a5a0b030 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:11.433062847 +0000 UTC m=+37.900324536 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert") pod "ingress-canary-s28sr" (UID: "dabb0188-2e22-40cb-b765-c6e0a5a0b030") : secret "canary-serving-cert" not found Apr 22 17:34:10.304328 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:10.304288 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mh726" event={"ID":"7c3eadbd-c6af-4686-bed4-c3a47b257864","Type":"ContainerStarted","Data":"7a187ec42785e8c21dcd35f79b4adb3f1031c1901b8e1a0f5fa287a827e72f23"} Apr 22 17:34:10.310727 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:10.310691 2572 generic.go:358] "Generic (PLEG): container finished" podID="000311a6-600b-4136-89c9-336cdc563106" containerID="aab35fc0919b39df6d85eff80ab6ae29618a8e3706fde8dbb8d770ead0c8451b" exitCode=0 Apr 22 17:34:10.310893 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:10.310765 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rdl42" event={"ID":"000311a6-600b-4136-89c9-336cdc563106","Type":"ContainerDied","Data":"aab35fc0919b39df6d85eff80ab6ae29618a8e3706fde8dbb8d770ead0c8451b"} Apr 22 17:34:11.253183 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.253048 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:11.253183 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.253108 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:11.253183 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.253147 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:11.254044 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:11.253204 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:34:11.254044 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:11.253271 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:34:11.254044 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:11.253569 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5944f7956-vsk55: secret "image-registry-tls" not found Apr 22 17:34:11.254044 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:11.253333 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle podName:17af67ec-8577-45de-abbb-01a7199ee7cd nodeName:}" failed. 
No retries permitted until 2026-04-22 17:34:15.253308188 +0000 UTC m=+41.720569868 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle") pod "router-default-9d9fb4b58-dpchv" (UID: "17af67ec-8577-45de-abbb-01a7199ee7cd") : configmap references non-existent config key: service-ca.crt Apr 22 17:34:11.254044 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:11.253622 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs podName:17af67ec-8577-45de-abbb-01a7199ee7cd nodeName:}" failed. No retries permitted until 2026-04-22 17:34:15.253603557 +0000 UTC m=+41.720865236 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs") pod "router-default-9d9fb4b58-dpchv" (UID: "17af67ec-8577-45de-abbb-01a7199ee7cd") : secret "router-metrics-certs-default" not found Apr 22 17:34:11.254044 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:11.253650 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls podName:84e06372-35da-4f2d-84a9-4f0537970fba nodeName:}" failed. No retries permitted until 2026-04-22 17:34:15.253642766 +0000 UTC m=+41.720904447 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls") pod "image-registry-5944f7956-vsk55" (UID: "84e06372-35da-4f2d-84a9-4f0537970fba") : secret "image-registry-tls" not found Apr 22 17:34:11.254044 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.253707 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:11.254044 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:11.253892 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:34:11.254044 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:11.253905 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69c75ddc67-c9dl5: secret "image-registry-tls" not found Apr 22 17:34:11.254044 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:11.253946 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls podName:31e11e0d-c49c-4634-b336-26f608c0be83 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:15.253934427 +0000 UTC m=+41.721196106 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls") pod "image-registry-69c75ddc67-c9dl5" (UID: "31e11e0d-c49c-4634-b336-26f608c0be83") : secret "image-registry-tls" not found Apr 22 17:34:11.355229 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.355191 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/001a3e34-ff95-45f5-a62e-b8389b0e0df0-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-cp6f7\" (UID: \"001a3e34-ff95-45f5-a62e-b8389b0e0df0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp6f7" Apr 22 17:34:11.355725 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.355348 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vpqhw\" (UID: \"e9ca84a4-7331-4976-9476-b7842a9814e3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw" Apr 22 17:34:11.355725 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:11.355380 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:34:11.355725 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.355445 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pcpz9\" (UID: \"59ff7442-70a1-4df1-a3a2-9eff7d027d6e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pcpz9" Apr 22 17:34:11.355725 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:11.355474 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/001a3e34-ff95-45f5-a62e-b8389b0e0df0-samples-operator-tls podName:001a3e34-ff95-45f5-a62e-b8389b0e0df0 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:15.355452752 +0000 UTC m=+41.822714446 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/001a3e34-ff95-45f5-a62e-b8389b0e0df0-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-cp6f7" (UID: "001a3e34-ff95-45f5-a62e-b8389b0e0df0") : secret "samples-operator-tls" not found Apr 22 17:34:11.355725 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:11.355502 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:34:11.355725 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:11.355517 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:34:11.355725 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:11.355557 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert podName:59ff7442-70a1-4df1-a3a2-9eff7d027d6e nodeName:}" failed. No retries permitted until 2026-04-22 17:34:15.355540693 +0000 UTC m=+41.822802376 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-pcpz9" (UID: "59ff7442-70a1-4df1-a3a2-9eff7d027d6e") : secret "networking-console-plugin-cert" not found Apr 22 17:34:11.355725 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:11.355585 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls podName:e9ca84a4-7331-4976-9476-b7842a9814e3 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:15.355578193 +0000 UTC m=+41.822839873 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vpqhw" (UID: "e9ca84a4-7331-4976-9476-b7842a9814e3") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:34:11.456318 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.456278 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert\") pod \"ingress-canary-s28sr\" (UID: \"dabb0188-2e22-40cb-b765-c6e0a5a0b030\") " pod="openshift-ingress-canary/ingress-canary-s28sr" Apr 22 17:34:11.456536 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.456435 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls\") pod \"dns-default-vj4xl\" (UID: \"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a\") " pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:11.456536 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:11.456460 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:34:11.456536 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:11.456530 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert podName:dabb0188-2e22-40cb-b765-c6e0a5a0b030 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:15.456509514 +0000 UTC m=+41.923771212 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert") pod "ingress-canary-s28sr" (UID: "dabb0188-2e22-40cb-b765-c6e0a5a0b030") : secret "canary-serving-cert" not found Apr 22 17:34:11.456725 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:11.456569 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:34:11.456725 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:11.456610 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls podName:b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a nodeName:}" failed. No retries permitted until 2026-04-22 17:34:15.456598011 +0000 UTC m=+41.923859699 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls") pod "dns-default-vj4xl" (UID: "b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a") : secret "dns-default-metrics-tls" not found Apr 22 17:34:11.473278 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.473246 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-tngvx"] Apr 22 17:34:11.481625 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.481597 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tngvx" Apr 22 17:34:11.483905 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.483877 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 17:34:11.486637 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.486608 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tngvx"] Apr 22 17:34:11.658563 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.658435 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/afc146fd-f74a-41e0-b236-6b55673d7657-dbus\") pod \"global-pull-secret-syncer-tngvx\" (UID: \"afc146fd-f74a-41e0-b236-6b55673d7657\") " pod="kube-system/global-pull-secret-syncer-tngvx" Apr 22 17:34:11.658563 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.658508 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/afc146fd-f74a-41e0-b236-6b55673d7657-original-pull-secret\") pod \"global-pull-secret-syncer-tngvx\" (UID: \"afc146fd-f74a-41e0-b236-6b55673d7657\") " pod="kube-system/global-pull-secret-syncer-tngvx" Apr 22 17:34:11.658563 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.658614 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/afc146fd-f74a-41e0-b236-6b55673d7657-kubelet-config\") pod \"global-pull-secret-syncer-tngvx\" (UID: \"afc146fd-f74a-41e0-b236-6b55673d7657\") " pod="kube-system/global-pull-secret-syncer-tngvx" Apr 22 17:34:11.760228 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.760193 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/afc146fd-f74a-41e0-b236-6b55673d7657-dbus\") pod \"global-pull-secret-syncer-tngvx\" (UID: \"afc146fd-f74a-41e0-b236-6b55673d7657\") " pod="kube-system/global-pull-secret-syncer-tngvx" Apr 22 17:34:11.760228 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.760234 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/afc146fd-f74a-41e0-b236-6b55673d7657-original-pull-secret\") pod \"global-pull-secret-syncer-tngvx\" (UID: \"afc146fd-f74a-41e0-b236-6b55673d7657\") " pod="kube-system/global-pull-secret-syncer-tngvx" Apr 22 17:34:11.760471 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.760409 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/afc146fd-f74a-41e0-b236-6b55673d7657-kubelet-config\") pod \"global-pull-secret-syncer-tngvx\" (UID: \"afc146fd-f74a-41e0-b236-6b55673d7657\") " 
pod="kube-system/global-pull-secret-syncer-tngvx" Apr 22 17:34:11.760530 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.760511 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/afc146fd-f74a-41e0-b236-6b55673d7657-dbus\") pod \"global-pull-secret-syncer-tngvx\" (UID: \"afc146fd-f74a-41e0-b236-6b55673d7657\") " pod="kube-system/global-pull-secret-syncer-tngvx" Apr 22 17:34:11.760576 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.760536 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/afc146fd-f74a-41e0-b236-6b55673d7657-kubelet-config\") pod \"global-pull-secret-syncer-tngvx\" (UID: \"afc146fd-f74a-41e0-b236-6b55673d7657\") " pod="kube-system/global-pull-secret-syncer-tngvx" Apr 22 17:34:11.763859 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.763687 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/afc146fd-f74a-41e0-b236-6b55673d7657-original-pull-secret\") pod \"global-pull-secret-syncer-tngvx\" (UID: \"afc146fd-f74a-41e0-b236-6b55673d7657\") " pod="kube-system/global-pull-secret-syncer-tngvx" Apr 22 17:34:11.793812 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:11.793777 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tngvx" Apr 22 17:34:15.293242 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:15.293212 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:15.293673 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:15.293253 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:15.293673 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:15.293335 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:15.293673 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:15.293394 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:15.293673 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:15.293621 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:34:15.293673 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:15.293640 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-69c75ddc67-c9dl5: secret "image-registry-tls" not found Apr 22 17:34:15.293936 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:15.293700 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls podName:31e11e0d-c49c-4634-b336-26f608c0be83 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:23.29368122 +0000 UTC m=+49.760942908 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls") pod "image-registry-69c75ddc67-c9dl5" (UID: "31e11e0d-c49c-4634-b336-26f608c0be83") : secret "image-registry-tls" not found Apr 22 17:34:15.294145 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:15.294123 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:34:15.294218 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:15.294140 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle podName:17af67ec-8577-45de-abbb-01a7199ee7cd nodeName:}" failed. No retries permitted until 2026-04-22 17:34:23.29411962 +0000 UTC m=+49.761381310 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle") pod "router-default-9d9fb4b58-dpchv" (UID: "17af67ec-8577-45de-abbb-01a7199ee7cd") : configmap references non-existent config key: service-ca.crt Apr 22 17:34:15.294218 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:15.294147 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5944f7956-vsk55: secret "image-registry-tls" not found Apr 22 17:34:15.294218 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:15.294196 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls podName:84e06372-35da-4f2d-84a9-4f0537970fba nodeName:}" failed. No retries permitted until 2026-04-22 17:34:23.294183171 +0000 UTC m=+49.761444852 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls") pod "image-registry-5944f7956-vsk55" (UID: "84e06372-35da-4f2d-84a9-4f0537970fba") : secret "image-registry-tls" not found Apr 22 17:34:15.294218 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:15.294202 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:34:15.294453 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:15.294245 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs podName:17af67ec-8577-45de-abbb-01a7199ee7cd nodeName:}" failed. No retries permitted until 2026-04-22 17:34:23.294233875 +0000 UTC m=+49.761495554 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs") pod "router-default-9d9fb4b58-dpchv" (UID: "17af67ec-8577-45de-abbb-01a7199ee7cd") : secret "router-metrics-certs-default" not found Apr 22 17:34:15.364817 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:15.364792 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tngvx"] Apr 22 17:34:15.367823 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:34:15.367797 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafc146fd_f74a_41e0_b236_6b55673d7657.slice/crio-ad10d9e9ebd4b3d122c03e62af833c20d4ce4fa46020677f80e3471dbb4ed6a4 WatchSource:0}: Error finding container ad10d9e9ebd4b3d122c03e62af833c20d4ce4fa46020677f80e3471dbb4ed6a4: Status 404 returned error can't find the container with id ad10d9e9ebd4b3d122c03e62af833c20d4ce4fa46020677f80e3471dbb4ed6a4 Apr 22 17:34:15.395199 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:15.395147 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/001a3e34-ff95-45f5-a62e-b8389b0e0df0-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-cp6f7\" (UID: \"001a3e34-ff95-45f5-a62e-b8389b0e0df0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp6f7" Apr 22 17:34:15.395322 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:15.395225 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vpqhw\" (UID: \"e9ca84a4-7331-4976-9476-b7842a9814e3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw" Apr 22 17:34:15.395475 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:15.395384 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:34:15.395475 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:15.395437 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:34:15.395584 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:15.395480 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/001a3e34-ff95-45f5-a62e-b8389b0e0df0-samples-operator-tls podName:001a3e34-ff95-45f5-a62e-b8389b0e0df0 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:23.395459736 +0000 UTC m=+49.862721433 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/001a3e34-ff95-45f5-a62e-b8389b0e0df0-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-cp6f7" (UID: "001a3e34-ff95-45f5-a62e-b8389b0e0df0") : secret "samples-operator-tls" not found Apr 22 17:34:15.395584 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:15.395504 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls podName:e9ca84a4-7331-4976-9476-b7842a9814e3 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:23.395494584 +0000 UTC m=+49.862756266 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vpqhw" (UID: "e9ca84a4-7331-4976-9476-b7842a9814e3") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:34:15.395584 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:15.395532 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pcpz9\" (UID: \"59ff7442-70a1-4df1-a3a2-9eff7d027d6e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pcpz9" Apr 22 17:34:15.395750 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:15.395676 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:34:15.395750 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:15.395716 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert podName:59ff7442-70a1-4df1-a3a2-9eff7d027d6e nodeName:}" failed. No retries permitted until 2026-04-22 17:34:23.395705165 +0000 UTC m=+49.862966859 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-pcpz9" (UID: "59ff7442-70a1-4df1-a3a2-9eff7d027d6e") : secret "networking-console-plugin-cert" not found Apr 22 17:34:15.496667 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:15.496575 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert\") pod \"ingress-canary-s28sr\" (UID: \"dabb0188-2e22-40cb-b765-c6e0a5a0b030\") " pod="openshift-ingress-canary/ingress-canary-s28sr" Apr 22 17:34:15.496784 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:15.496725 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls\") pod \"dns-default-vj4xl\" (UID: \"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a\") " pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:15.497084 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:15.496986 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:34:15.497084 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:15.497051 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert podName:dabb0188-2e22-40cb-b765-c6e0a5a0b030 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:23.497030249 +0000 UTC m=+49.964291950 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert") pod "ingress-canary-s28sr" (UID: "dabb0188-2e22-40cb-b765-c6e0a5a0b030") : secret "canary-serving-cert" not found Apr 22 17:34:15.497471 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:15.497358 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:34:15.497471 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:15.497444 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls podName:b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a nodeName:}" failed. No retries permitted until 2026-04-22 17:34:23.497402117 +0000 UTC m=+49.964663800 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls") pod "dns-default-vj4xl" (UID: "b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a") : secret "dns-default-metrics-tls" not found Apr 22 17:34:16.328401 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:16.328345 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6bzf9" event={"ID":"346331ec-1cea-46ea-8952-6af403c257c0","Type":"ContainerStarted","Data":"7167e4fa3954dac7a1d0036ad421e79895b70cd6b2f1999466a60010488d81ed"} Apr 22 17:34:16.330300 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:16.330257 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-7rk27" event={"ID":"2dd100f6-0060-426d-9cf7-a7f9fafa003a","Type":"ContainerStarted","Data":"dd2e1785346a61fe4c5b4a25ba44e7d9b528e3b5ee1c2bd7c11bcbc56e77e236"} Apr 22 17:34:16.335105 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:16.335076 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v8lmw" event={"ID":"8b7e852e-32bb-4bbe-be45-374b4376ee6d","Type":"ContainerStarted","Data":"c523b8ae86d51505845ed8c009b0a1289b6c3b8018f4bc2f1b192f999184e9c1"} Apr 22 17:34:16.337729 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:16.337320 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2qtfd" event={"ID":"daeace97-112e-453b-ae2c-bd7b73b63cc1","Type":"ContainerStarted","Data":"89cbe881db7a53d1d48bf8273912f5aae51a0f7d1967444179e6e85188c0400f"} Apr 22 17:34:16.342235 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:16.341805 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mh726" event={"ID":"7c3eadbd-c6af-4686-bed4-c3a47b257864","Type":"ContainerStarted","Data":"4aa5a10594b8e970919bfbc4f1445ac6ff07de26738ca6d900962139e15b4b63"} Apr 22 17:34:16.342235 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:16.341988 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:34:16.343555 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:16.343061 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tngvx" event={"ID":"afc146fd-f74a-41e0-b236-6b55673d7657","Type":"ContainerStarted","Data":"ad10d9e9ebd4b3d122c03e62af833c20d4ce4fa46020677f80e3471dbb4ed6a4"} Apr 22 17:34:16.344994 ip-10-0-131-22 kubenswrapper[2572]: I0422 
17:34:16.344473 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6bzf9" podStartSLOduration=22.305272501 podStartE2EDuration="29.344411235s" podCreationTimestamp="2026-04-22 17:33:47 +0000 UTC" firstStartedPulling="2026-04-22 17:34:08.190378588 +0000 UTC m=+34.657640280" lastFinishedPulling="2026-04-22 17:34:15.229517329 +0000 UTC m=+41.696779014" observedRunningTime="2026-04-22 17:34:16.343280789 +0000 UTC m=+42.810542493" watchObservedRunningTime="2026-04-22 17:34:16.344411235 +0000 UTC m=+42.811672939" Apr 22 17:34:16.348246 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:16.348212 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rdl42" event={"ID":"000311a6-600b-4136-89c9-336cdc563106","Type":"ContainerStarted","Data":"60c7b51fdbdd1b46073a18546a4ccd20acd326998e3777f0f05637b81a275616"} Apr 22 17:34:16.350334 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:16.350317 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/0.log" Apr 22 17:34:16.350447 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:16.350352 2572 generic.go:358] "Generic (PLEG): container finished" podID="7bfe6b31-c032-43ba-be07-caa12af15041" containerID="c69b80fd9b90dab8ab6118bc6ffd55aa949daaf7956cddcacbb989ade235fdbd" exitCode=255 Apr 22 17:34:16.350567 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:16.350540 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" event={"ID":"7bfe6b31-c032-43ba-be07-caa12af15041","Type":"ContainerDied","Data":"c69b80fd9b90dab8ab6118bc6ffd55aa949daaf7956cddcacbb989ade235fdbd"} Apr 22 17:34:16.352897 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:16.352294 2572 scope.go:117] "RemoveContainer" containerID="c69b80fd9b90dab8ab6118bc6ffd55aa949daaf7956cddcacbb989ade235fdbd" Apr 22 17:34:16.352897 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:16.352600 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8cqtb" event={"ID":"7526861b-0afa-4db9-9077-282b6ab524f3","Type":"ContainerStarted","Data":"3b5944634ca4c32043fe3e691f913f44425bc9224e6b4c239efdaa89f9b45eb4"} Apr 22 17:34:16.357727 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:16.357685 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-mh726" podStartSLOduration=36.271033955 podStartE2EDuration="42.357670405s" podCreationTimestamp="2026-04-22 17:33:34 +0000 UTC" firstStartedPulling="2026-04-22 17:34:09.322345218 +0000 UTC m=+35.789606898" lastFinishedPulling="2026-04-22 17:34:15.408981652 +0000 UTC m=+41.876243348" observedRunningTime="2026-04-22 17:34:16.356153311 +0000 UTC m=+42.823415014" watchObservedRunningTime="2026-04-22 17:34:16.357670405 +0000 UTC m=+42.824932099" Apr 22 17:34:16.416638 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:16.414780 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2qtfd" podStartSLOduration=22.366996811 podStartE2EDuration="29.414763415s" podCreationTimestamp="2026-04-22 17:33:47 +0000 UTC" firstStartedPulling="2026-04-22 17:34:08.190008375 +0000 UTC m=+34.657270059" lastFinishedPulling="2026-04-22 
17:34:15.237774982 +0000 UTC m=+41.705036663" observedRunningTime="2026-04-22 17:34:16.414516596 +0000 UTC m=+42.881778299" watchObservedRunningTime="2026-04-22 17:34:16.414763415 +0000 UTC m=+42.882025123" Apr 22 17:34:16.416638 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:16.416290 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-7rk27" podStartSLOduration=22.377256553 podStartE2EDuration="29.41627361s" podCreationTimestamp="2026-04-22 17:33:47 +0000 UTC" firstStartedPulling="2026-04-22 17:34:08.190266808 +0000 UTC m=+34.657528493" lastFinishedPulling="2026-04-22 17:34:15.229283866 +0000 UTC m=+41.696545550" observedRunningTime="2026-04-22 17:34:16.382249977 +0000 UTC m=+42.849511680" watchObservedRunningTime="2026-04-22 17:34:16.41627361 +0000 UTC m=+42.883535322" Apr 22 17:34:16.463892 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:16.463120 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rdl42" podStartSLOduration=11.22481529 podStartE2EDuration="42.463102295s" podCreationTimestamp="2026-04-22 17:33:34 +0000 UTC" firstStartedPulling="2026-04-22 17:33:36.974905753 +0000 UTC m=+3.442167432" lastFinishedPulling="2026-04-22 17:34:08.213192747 +0000 UTC m=+34.680454437" observedRunningTime="2026-04-22 17:34:16.462091114 +0000 UTC m=+42.929352817" watchObservedRunningTime="2026-04-22 17:34:16.463102295 +0000 UTC m=+42.930364011" Apr 22 17:34:16.464191 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:16.464146 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v8lmw" podStartSLOduration=22.424653228 podStartE2EDuration="29.464134992s" podCreationTimestamp="2026-04-22 17:33:47 +0000 UTC" firstStartedPulling="2026-04-22 17:34:08.190168289 +0000 UTC m=+34.657429991" lastFinishedPulling="2026-04-22 17:34:15.229650076 +0000 UTC m=+41.696911755" observedRunningTime="2026-04-22 17:34:16.433690985 +0000 UTC m=+42.900952688" watchObservedRunningTime="2026-04-22 17:34:16.464134992 +0000 UTC m=+42.931396694" Apr 22 17:34:16.496389 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:16.496284 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8cqtb" podStartSLOduration=22.457576734 podStartE2EDuration="29.496265951s" podCreationTimestamp="2026-04-22 17:33:47 +0000 UTC" firstStartedPulling="2026-04-22 17:34:08.190348037 +0000 UTC m=+34.657609720" lastFinishedPulling="2026-04-22 17:34:15.229037252 +0000 UTC m=+41.696298937" observedRunningTime="2026-04-22 17:34:16.494961487 +0000 UTC m=+42.962223186" watchObservedRunningTime="2026-04-22 17:34:16.496265951 +0000 UTC m=+42.963527654" Apr 22 17:34:17.187349 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:17.187313 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-krtt6"] Apr 22 17:34:17.210415 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:17.210248 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-krtt6"] Apr 22 17:34:17.210415 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:17.210399 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-krtt6" Apr 22 17:34:17.213338 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:17.213310 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 17:34:17.213486 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:17.213316 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-rxmss\"" Apr 22 17:34:17.214332 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:17.214151 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 17:34:17.318998 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:17.318963 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k869m\" (UniqueName: \"kubernetes.io/projected/ff80dfe5-183b-4c0c-90e5-3467e987deec-kube-api-access-k869m\") pod \"migrator-74bb7799d9-krtt6\" (UID: \"ff80dfe5-183b-4c0c-90e5-3467e987deec\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-krtt6" Apr 22 17:34:17.357782 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:17.357707 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 17:34:17.358270 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:17.358121 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/0.log" Apr 22 17:34:17.358270 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:17.358157 2572 generic.go:358] "Generic (PLEG): container finished" podID="7bfe6b31-c032-43ba-be07-caa12af15041" containerID="474ad34552d5da1dcd0e1b2a9cf1aefd4dea617ee2be9559f7a0bdd4a280c35b" exitCode=255 Apr 22 17:34:17.359750 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:17.358861 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" event={"ID":"7bfe6b31-c032-43ba-be07-caa12af15041","Type":"ContainerDied","Data":"474ad34552d5da1dcd0e1b2a9cf1aefd4dea617ee2be9559f7a0bdd4a280c35b"} Apr 22 17:34:17.359750 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:17.358905 2572 scope.go:117] "RemoveContainer" containerID="c69b80fd9b90dab8ab6118bc6ffd55aa949daaf7956cddcacbb989ade235fdbd" Apr 22 17:34:17.359956 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:17.359769 2572 scope.go:117] "RemoveContainer" containerID="474ad34552d5da1dcd0e1b2a9cf1aefd4dea617ee2be9559f7a0bdd4a280c35b" Apr 22 17:34:17.360008 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:17.359950 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-vvmj2_openshift-console-operator(7bfe6b31-c032-43ba-be07-caa12af15041)\"" pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" podUID="7bfe6b31-c032-43ba-be07-caa12af15041" Apr 22 17:34:17.420338 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:17.420305 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k869m\" (UniqueName: 
\"kubernetes.io/projected/ff80dfe5-183b-4c0c-90e5-3467e987deec-kube-api-access-k869m\") pod \"migrator-74bb7799d9-krtt6\" (UID: \"ff80dfe5-183b-4c0c-90e5-3467e987deec\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-krtt6" Apr 22 17:34:17.430842 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:17.430808 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k869m\" (UniqueName: \"kubernetes.io/projected/ff80dfe5-183b-4c0c-90e5-3467e987deec-kube-api-access-k869m\") pod \"migrator-74bb7799d9-krtt6\" (UID: \"ff80dfe5-183b-4c0c-90e5-3467e987deec\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-krtt6" Apr 22 17:34:17.524352 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:17.524281 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-krtt6" Apr 22 17:34:17.666465 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:17.666414 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-krtt6"] Apr 22 17:34:17.683603 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:34:17.683400 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff80dfe5_183b_4c0c_90e5_3467e987deec.slice/crio-2e3fa10ceb1068c7415d427b9868cbe8d4198e3a93741075e89fb1acef6f232e WatchSource:0}: Error finding container 2e3fa10ceb1068c7415d427b9868cbe8d4198e3a93741075e89fb1acef6f232e: Status 404 returned error can't find the container with id 2e3fa10ceb1068c7415d427b9868cbe8d4198e3a93741075e89fb1acef6f232e Apr 22 17:34:17.815361 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:17.815270 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" Apr 22 17:34:17.815361 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:17.815309 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" Apr 22 17:34:17.977031 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:17.977001 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lxs9q_d3515122-d7cf-41fe-855d-d19ccfe73070/dns-node-resolver/0.log" Apr 22 17:34:18.363938 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:18.363897 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-krtt6" event={"ID":"ff80dfe5-183b-4c0c-90e5-3467e987deec","Type":"ContainerStarted","Data":"2e3fa10ceb1068c7415d427b9868cbe8d4198e3a93741075e89fb1acef6f232e"} Apr 22 17:34:18.365902 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:18.365764 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 17:34:18.366158 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:18.366137 2572 scope.go:117] "RemoveContainer" containerID="474ad34552d5da1dcd0e1b2a9cf1aefd4dea617ee2be9559f7a0bdd4a280c35b" Apr 22 17:34:18.366678 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:18.366396 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-vvmj2_openshift-console-operator(7bfe6b31-c032-43ba-be07-caa12af15041)\"" pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" podUID="7bfe6b31-c032-43ba-be07-caa12af15041" Apr 22 17:34:18.572829 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:18.572801 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5lt7k_5ea00ae3-4a64-4435-be9b-6d9aec346440/node-ca/0.log" Apr 22 17:34:19.369365 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:19.369331 2572 scope.go:117] "RemoveContainer" containerID="474ad34552d5da1dcd0e1b2a9cf1aefd4dea617ee2be9559f7a0bdd4a280c35b" Apr 22 17:34:19.369819 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:19.369568 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-vvmj2_openshift-console-operator(7bfe6b31-c032-43ba-be07-caa12af15041)\"" pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" podUID="7bfe6b31-c032-43ba-be07-caa12af15041" Apr 22 17:34:20.017582 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:20.017548 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-9rfrc"] Apr 22 17:34:20.041152 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:20.041094 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-9rfrc"] Apr 22 17:34:20.041330 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:20.041227 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-9rfrc" Apr 22 17:34:20.044108 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:20.044077 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 17:34:20.044219 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:20.044113 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 17:34:20.044219 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:20.044162 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 17:34:20.044412 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:20.044396 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 17:34:20.045286 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:20.044996 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-8vnvp\"" Apr 22 17:34:20.144467 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:20.144435 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1a26ceba-c8b2-4875-b970-8c11dba4c575-signing-key\") pod \"service-ca-865cb79987-9rfrc\" (UID: \"1a26ceba-c8b2-4875-b970-8c11dba4c575\") " pod="openshift-service-ca/service-ca-865cb79987-9rfrc" Apr 22 17:34:20.144646 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:20.144521 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs4ph\" (UniqueName: \"kubernetes.io/projected/1a26ceba-c8b2-4875-b970-8c11dba4c575-kube-api-access-xs4ph\") pod 
\"service-ca-865cb79987-9rfrc\" (UID: \"1a26ceba-c8b2-4875-b970-8c11dba4c575\") " pod="openshift-service-ca/service-ca-865cb79987-9rfrc" Apr 22 17:34:20.144712 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:20.144642 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1a26ceba-c8b2-4875-b970-8c11dba4c575-signing-cabundle\") pod \"service-ca-865cb79987-9rfrc\" (UID: \"1a26ceba-c8b2-4875-b970-8c11dba4c575\") " pod="openshift-service-ca/service-ca-865cb79987-9rfrc" Apr 22 17:34:20.245951 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:20.245911 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1a26ceba-c8b2-4875-b970-8c11dba4c575-signing-key\") pod \"service-ca-865cb79987-9rfrc\" (UID: \"1a26ceba-c8b2-4875-b970-8c11dba4c575\") " pod="openshift-service-ca/service-ca-865cb79987-9rfrc" Apr 22 17:34:20.246151 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:20.246092 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xs4ph\" (UniqueName: \"kubernetes.io/projected/1a26ceba-c8b2-4875-b970-8c11dba4c575-kube-api-access-xs4ph\") pod \"service-ca-865cb79987-9rfrc\" (UID: \"1a26ceba-c8b2-4875-b970-8c11dba4c575\") " pod="openshift-service-ca/service-ca-865cb79987-9rfrc" Apr 22 17:34:20.246221 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:20.246199 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1a26ceba-c8b2-4875-b970-8c11dba4c575-signing-cabundle\") pod \"service-ca-865cb79987-9rfrc\" (UID: \"1a26ceba-c8b2-4875-b970-8c11dba4c575\") " pod="openshift-service-ca/service-ca-865cb79987-9rfrc" Apr 22 17:34:20.246932 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:20.246909 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1a26ceba-c8b2-4875-b970-8c11dba4c575-signing-cabundle\") pod \"service-ca-865cb79987-9rfrc\" (UID: \"1a26ceba-c8b2-4875-b970-8c11dba4c575\") " pod="openshift-service-ca/service-ca-865cb79987-9rfrc" Apr 22 17:34:20.248819 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:20.248796 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1a26ceba-c8b2-4875-b970-8c11dba4c575-signing-key\") pod \"service-ca-865cb79987-9rfrc\" (UID: \"1a26ceba-c8b2-4875-b970-8c11dba4c575\") " pod="openshift-service-ca/service-ca-865cb79987-9rfrc" Apr 22 17:34:20.254439 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:20.254400 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs4ph\" (UniqueName: \"kubernetes.io/projected/1a26ceba-c8b2-4875-b970-8c11dba4c575-kube-api-access-xs4ph\") pod \"service-ca-865cb79987-9rfrc\" (UID: \"1a26ceba-c8b2-4875-b970-8c11dba4c575\") " pod="openshift-service-ca/service-ca-865cb79987-9rfrc" Apr 22 17:34:20.352723 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:20.352613 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-9rfrc" Apr 22 17:34:21.279152 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:21.278962 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-9rfrc"] Apr 22 17:34:21.287199 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:34:21.287168 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a26ceba_c8b2_4875_b970_8c11dba4c575.slice/crio-4bdf5c1492988f24e7b3ea52ccfe3d3a98368d8b8301b33a47d8771dfb479280 WatchSource:0}: Error finding container 4bdf5c1492988f24e7b3ea52ccfe3d3a98368d8b8301b33a47d8771dfb479280: Status 404 returned error can't find the container with id 4bdf5c1492988f24e7b3ea52ccfe3d3a98368d8b8301b33a47d8771dfb479280 Apr 22 17:34:21.377116 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:21.377046 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tngvx" event={"ID":"afc146fd-f74a-41e0-b236-6b55673d7657","Type":"ContainerStarted","Data":"80c4a43687d5cc31b3cce8cdfd598e83d92bdc7d3b78705589539413e70ed409"} Apr 22 17:34:21.378368 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:21.378335 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-9rfrc" event={"ID":"1a26ceba-c8b2-4875-b970-8c11dba4c575","Type":"ContainerStarted","Data":"0caa0970888db4ae69c56bdf9fe210c1dfc5db01179faf105e70324e1544ec38"} Apr 22 17:34:21.378368 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:21.378368 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-9rfrc" event={"ID":"1a26ceba-c8b2-4875-b970-8c11dba4c575","Type":"ContainerStarted","Data":"4bdf5c1492988f24e7b3ea52ccfe3d3a98368d8b8301b33a47d8771dfb479280"} Apr 22 17:34:21.379822 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:21.379800 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-krtt6" event={"ID":"ff80dfe5-183b-4c0c-90e5-3467e987deec","Type":"ContainerStarted","Data":"e0aa8e0773ae434aee5411ac0dc54316892bfb6d5da2ff82854a5fb39a266101"} Apr 22 17:34:22.384976 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:22.384879 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-krtt6" event={"ID":"ff80dfe5-183b-4c0c-90e5-3467e987deec","Type":"ContainerStarted","Data":"dceb849b0ca8d60c1461883d66f63a650e31e4da384cb71581ca325e861a65a3"} Apr 22 17:34:22.423055 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:22.422170 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-9rfrc" podStartSLOduration=3.422153042 podStartE2EDuration="3.422153042s" podCreationTimestamp="2026-04-22 17:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:34:22.420799053 +0000 UTC m=+48.888060757" watchObservedRunningTime="2026-04-22 17:34:22.422153042 +0000 UTC m=+48.889414745" Apr 22 17:34:22.451168 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:22.450896 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-krtt6" podStartSLOduration=1.9845889479999999 podStartE2EDuration="5.450878649s" podCreationTimestamp="2026-04-22 17:34:17 +0000 UTC" firstStartedPulling="2026-04-22 
17:34:17.6856369 +0000 UTC m=+44.152898592" lastFinishedPulling="2026-04-22 17:34:21.151926599 +0000 UTC m=+47.619188293" observedRunningTime="2026-04-22 17:34:22.450565643 +0000 UTC m=+48.917827349" watchObservedRunningTime="2026-04-22 17:34:22.450878649 +0000 UTC m=+48.918140352" Apr 22 17:34:22.470931 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:22.470860 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-tngvx" podStartSLOduration=5.687996564 podStartE2EDuration="11.470840997s" podCreationTimestamp="2026-04-22 17:34:11 +0000 UTC" firstStartedPulling="2026-04-22 17:34:15.379200666 +0000 UTC m=+41.846462360" lastFinishedPulling="2026-04-22 17:34:21.16204511 +0000 UTC m=+47.629306793" observedRunningTime="2026-04-22 17:34:22.46975412 +0000 UTC m=+48.937015822" watchObservedRunningTime="2026-04-22 17:34:22.470840997 +0000 UTC m=+48.938102701" Apr 22 17:34:23.373983 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:23.373949 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:23.373983 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:23.373985 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:23.374245 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:23.374006 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:23.374245 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:23.374114 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:34:23.374245 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:23.374127 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:23.374245 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:23.374187 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs podName:17af67ec-8577-45de-abbb-01a7199ee7cd nodeName:}" failed. No retries permitted until 2026-04-22 17:34:39.374166419 +0000 UTC m=+65.841428117 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs") pod "router-default-9d9fb4b58-dpchv" (UID: "17af67ec-8577-45de-abbb-01a7199ee7cd") : secret "router-metrics-certs-default" not found Apr 22 17:34:23.374245 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:23.374189 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:34:23.374245 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:23.374201 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69c75ddc67-c9dl5: secret "image-registry-tls" not found Apr 22 17:34:23.374245 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:23.374230 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:34:23.374520 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:23.374256 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5944f7956-vsk55: secret "image-registry-tls" not found Apr 22 17:34:23.374520 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:23.374236 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls podName:31e11e0d-c49c-4634-b336-26f608c0be83 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:39.374226862 +0000 UTC m=+65.841488579 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls") pod "image-registry-69c75ddc67-c9dl5" (UID: "31e11e0d-c49c-4634-b336-26f608c0be83") : secret "image-registry-tls" not found Apr 22 17:34:23.374520 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:23.374299 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle podName:17af67ec-8577-45de-abbb-01a7199ee7cd nodeName:}" failed. No retries permitted until 2026-04-22 17:34:39.37428469 +0000 UTC m=+65.841546370 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle") pod "router-default-9d9fb4b58-dpchv" (UID: "17af67ec-8577-45de-abbb-01a7199ee7cd") : configmap references non-existent config key: service-ca.crt Apr 22 17:34:23.374520 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:23.374314 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls podName:84e06372-35da-4f2d-84a9-4f0537970fba nodeName:}" failed. No retries permitted until 2026-04-22 17:34:39.374308017 +0000 UTC m=+65.841569696 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls") pod "image-registry-5944f7956-vsk55" (UID: "84e06372-35da-4f2d-84a9-4f0537970fba") : secret "image-registry-tls" not found Apr 22 17:34:23.475127 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:23.475094 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:34:23.475614 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:23.475158 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert podName:59ff7442-70a1-4df1-a3a2-9eff7d027d6e nodeName:}" failed. No retries permitted until 2026-04-22 17:34:39.475145053 +0000 UTC m=+65.942406733 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-pcpz9" (UID: "59ff7442-70a1-4df1-a3a2-9eff7d027d6e") : secret "networking-console-plugin-cert" not found Apr 22 17:34:23.475614 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:23.475082 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pcpz9\" (UID: \"59ff7442-70a1-4df1-a3a2-9eff7d027d6e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pcpz9" Apr 22 17:34:23.475614 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:23.475371 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/001a3e34-ff95-45f5-a62e-b8389b0e0df0-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-cp6f7\" (UID: \"001a3e34-ff95-45f5-a62e-b8389b0e0df0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp6f7" Apr 22 17:34:23.475614 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:23.475478 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vpqhw\" (UID: \"e9ca84a4-7331-4976-9476-b7842a9814e3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw" Apr 22 17:34:23.475614 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:23.475583 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:34:23.475895 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:23.475622 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls podName:e9ca84a4-7331-4976-9476-b7842a9814e3 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:39.475612457 +0000 UTC m=+65.942874141 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vpqhw" (UID: "e9ca84a4-7331-4976-9476-b7842a9814e3") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:34:23.477847 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:23.477824 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/001a3e34-ff95-45f5-a62e-b8389b0e0df0-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-cp6f7\" (UID: \"001a3e34-ff95-45f5-a62e-b8389b0e0df0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp6f7" Apr 22 17:34:23.479548 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:23.479535 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp6f7" Apr 22 17:34:23.576149 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:23.576118 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls\") pod \"dns-default-vj4xl\" (UID: \"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a\") " pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:23.576309 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:23.576228 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert\") pod \"ingress-canary-s28sr\" (UID: \"dabb0188-2e22-40cb-b765-c6e0a5a0b030\") " pod="openshift-ingress-canary/ingress-canary-s28sr" Apr 22 17:34:23.576309 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:23.576288 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:34:23.576436 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:23.576332 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:34:23.576436 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:23.576371 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls podName:b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a nodeName:}" failed. No retries permitted until 2026-04-22 17:34:39.576347996 +0000 UTC m=+66.043609679 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls") pod "dns-default-vj4xl" (UID: "b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a") : secret "dns-default-metrics-tls" not found Apr 22 17:34:23.576436 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:34:23.576391 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert podName:dabb0188-2e22-40cb-b765-c6e0a5a0b030 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:39.576380955 +0000 UTC m=+66.043642642 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert") pod "ingress-canary-s28sr" (UID: "dabb0188-2e22-40cb-b765-c6e0a5a0b030") : secret "canary-serving-cert" not found Apr 22 17:34:23.607766 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:23.607735 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp6f7"] Apr 22 17:34:24.392463 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:24.392415 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp6f7" event={"ID":"001a3e34-ff95-45f5-a62e-b8389b0e0df0","Type":"ContainerStarted","Data":"a53563993384dac8de76d1f6b248f64ce936c3e3ef5c94b191d62f524d95a821"} Apr 22 17:34:26.400173 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:26.400139 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp6f7" event={"ID":"001a3e34-ff95-45f5-a62e-b8389b0e0df0","Type":"ContainerStarted","Data":"2d800fe3a372a86549f6aaf1302b71419ef3ccd0cea6546cdacd00181bb90f84"} Apr 22 17:34:26.400639 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:26.400181 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp6f7" event={"ID":"001a3e34-ff95-45f5-a62e-b8389b0e0df0","Type":"ContainerStarted","Data":"181d26e9bec1d7d765286405972917503587c704c4e6f9b82dc8e8c945d583b0"} Apr 22 17:34:26.417819 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:26.417768 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp6f7" podStartSLOduration=37.5623693 podStartE2EDuration="39.417750521s" podCreationTimestamp="2026-04-22 17:33:47 +0000 UTC" firstStartedPulling="2026-04-22 17:34:23.683022148 +0000 UTC m=+50.150283829" lastFinishedPulling="2026-04-22 17:34:25.538403362 +0000 UTC m=+52.005665050" observedRunningTime="2026-04-22 17:34:26.417530337 +0000 UTC m=+52.884792059" watchObservedRunningTime="2026-04-22 17:34:26.417750521 +0000 UTC m=+52.885012225" Apr 22 17:34:31.124258 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:31.124224 2572 scope.go:117] "RemoveContainer" containerID="474ad34552d5da1dcd0e1b2a9cf1aefd4dea617ee2be9559f7a0bdd4a280c35b" Apr 22 17:34:31.415219 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:31.415184 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 17:34:31.415390 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:31.415286 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" event={"ID":"7bfe6b31-c032-43ba-be07-caa12af15041","Type":"ContainerStarted","Data":"71dd0a8433e3d5bdff0b8384c2abf7be05a7e327bd65c4e3e6e030327c120816"} Apr 22 17:34:31.415740 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:31.415702 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" Apr 22 17:34:31.435883 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:31.435811 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" podStartSLOduration=37.39671408 
podStartE2EDuration="44.435791168s" podCreationTimestamp="2026-04-22 17:33:47 +0000 UTC" firstStartedPulling="2026-04-22 17:34:08.1899584 +0000 UTC m=+34.657220085" lastFinishedPulling="2026-04-22 17:34:15.22903548 +0000 UTC m=+41.696297173" observedRunningTime="2026-04-22 17:34:31.43513881 +0000 UTC m=+57.902400523" watchObservedRunningTime="2026-04-22 17:34:31.435791168 +0000 UTC m=+57.903052870" Apr 22 17:34:32.415582 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:32.415527 2572 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-vvmj2 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.132.0.9:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 22 17:34:32.416028 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:32.415600 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" podUID="7bfe6b31-c032-43ba-be07-caa12af15041" containerName="console-operator" probeResult="failure" output="Get \"https://10.132.0.9:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 22 17:34:32.446517 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:32.446489 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-vvmj2" Apr 22 17:34:33.282663 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:33.282636 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-knlln" Apr 22 17:34:39.426101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.426063 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:39.426101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.426107 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:39.426630 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.426134 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:39.426630 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.426188 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:39.426744 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.426729 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17af67ec-8577-45de-abbb-01a7199ee7cd-service-ca-bundle\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:39.428687 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.428658 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17af67ec-8577-45de-abbb-01a7199ee7cd-metrics-certs\") pod \"router-default-9d9fb4b58-dpchv\" (UID: \"17af67ec-8577-45de-abbb-01a7199ee7cd\") " pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:39.428790 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.428698 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls\") pod \"image-registry-5944f7956-vsk55\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:39.428830 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.428805 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls\") pod \"image-registry-69c75ddc67-c9dl5\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:39.527595 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.527557 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pcpz9\" (UID: \"59ff7442-70a1-4df1-a3a2-9eff7d027d6e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pcpz9" Apr 22 17:34:39.527762 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.527693 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vpqhw\" (UID: \"e9ca84a4-7331-4976-9476-b7842a9814e3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw" Apr 22 17:34:39.530075 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.530036 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9ca84a4-7331-4976-9476-b7842a9814e3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vpqhw\" (UID: \"e9ca84a4-7331-4976-9476-b7842a9814e3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw" Apr 22 17:34:39.530188 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.530038 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/59ff7442-70a1-4df1-a3a2-9eff7d027d6e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pcpz9\" (UID: \"59ff7442-70a1-4df1-a3a2-9eff7d027d6e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pcpz9" Apr 22 17:34:39.564257 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.564223 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-mmlbf\"" Apr 22 17:34:39.572445 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.572405 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:39.575261 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.575242 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:39.596791 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.596761 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-nn5dh\"" Apr 22 17:34:39.604989 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.604956 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:39.628485 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.628445 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls\") pod \"dns-default-vj4xl\" (UID: \"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a\") " pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:39.628661 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.628549 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert\") pod \"ingress-canary-s28sr\" (UID: \"dabb0188-2e22-40cb-b765-c6e0a5a0b030\") " pod="openshift-ingress-canary/ingress-canary-s28sr" Apr 22 17:34:39.632387 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.632331 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dabb0188-2e22-40cb-b765-c6e0a5a0b030-cert\") pod \"ingress-canary-s28sr\" (UID: \"dabb0188-2e22-40cb-b765-c6e0a5a0b030\") " pod="openshift-ingress-canary/ingress-canary-s28sr" Apr 22 17:34:39.633255 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.633206 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a-metrics-tls\") pod \"dns-default-vj4xl\" (UID: \"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a\") " pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:39.687408 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.687157 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-b6264\"" Apr 22 17:34:39.696332 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.695849 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw" Apr 22 17:34:39.713575 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.713327 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-rxzrr\"" Apr 22 17:34:39.715328 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.715303 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-22kpx"] Apr 22 17:34:39.715461 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.715324 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-pcpz9" Apr 22 17:34:39.751062 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.750939 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-22kpx"] Apr 22 17:34:39.751062 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.750980 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69c75ddc67-c9dl5"] Apr 22 17:34:39.751309 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.751150 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-22kpx" Apr 22 17:34:39.755724 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.754657 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-mtj7t\"" Apr 22 17:34:39.755724 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.755454 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 17:34:39.756258 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.756078 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 17:34:39.759307 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.759093 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-k4t5h\"" Apr 22 17:34:39.769020 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.766544 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s28sr" Apr 22 17:34:39.791328 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.776257 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ghfnt\"" Apr 22 17:34:39.791328 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.785952 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:39.798538 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.797572 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5944f7956-vsk55"] Apr 22 17:34:39.818948 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.818905 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69c75ddc67-c9dl5"] Apr 22 17:34:39.823600 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:34:39.823554 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31e11e0d_c49c_4634_b336_26f608c0be83.slice/crio-054dd7308974508782b0ed3ba184e37cccbe651fb193bb89ae615b5e29e4d77a WatchSource:0}: Error finding container 054dd7308974508782b0ed3ba184e37cccbe651fb193bb89ae615b5e29e4d77a: Status 404 returned error can't find the container with id 054dd7308974508782b0ed3ba184e37cccbe651fb193bb89ae615b5e29e4d77a Apr 22 17:34:39.827155 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.825584 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-sm7xt"] Apr 22 17:34:39.832023 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.831697 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/65f406bd-2301-4de9-9b94-1fb285e03d6e-crio-socket\") pod \"insights-runtime-extractor-22kpx\" (UID: \"65f406bd-2301-4de9-9b94-1fb285e03d6e\") " pod="openshift-insights/insights-runtime-extractor-22kpx" Apr 22 17:34:39.832023 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.831750 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/65f406bd-2301-4de9-9b94-1fb285e03d6e-data-volume\") pod \"insights-runtime-extractor-22kpx\" (UID: \"65f406bd-2301-4de9-9b94-1fb285e03d6e\") " pod="openshift-insights/insights-runtime-extractor-22kpx" Apr 22 17:34:39.832023 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.831860 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/65f406bd-2301-4de9-9b94-1fb285e03d6e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-22kpx\" (UID: \"65f406bd-2301-4de9-9b94-1fb285e03d6e\") " pod="openshift-insights/insights-runtime-extractor-22kpx" Apr 22 17:34:39.832023 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.831904 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w58f2\" (UniqueName: \"kubernetes.io/projected/65f406bd-2301-4de9-9b94-1fb285e03d6e-kube-api-access-w58f2\") pod \"insights-runtime-extractor-22kpx\" (UID: \"65f406bd-2301-4de9-9b94-1fb285e03d6e\") " pod="openshift-insights/insights-runtime-extractor-22kpx" Apr 22 17:34:39.832023 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.831938 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/65f406bd-2301-4de9-9b94-1fb285e03d6e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-22kpx\" (UID: \"65f406bd-2301-4de9-9b94-1fb285e03d6e\") " pod="openshift-insights/insights-runtime-extractor-22kpx" Apr 22 17:34:39.867100 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.865342 2572 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-sm7xt"] Apr 22 17:34:39.867100 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.865397 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-9d9fb4b58-dpchv"] Apr 22 17:34:39.867100 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.865533 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-sm7xt" Apr 22 17:34:39.871081 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.870509 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 17:34:39.871081 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.870759 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 17:34:39.871081 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.870933 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-848bn\"" Apr 22 17:34:39.885296 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.884392 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw"] Apr 22 17:34:39.934614 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.932462 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w58f2\" (UniqueName: \"kubernetes.io/projected/65f406bd-2301-4de9-9b94-1fb285e03d6e-kube-api-access-w58f2\") pod \"insights-runtime-extractor-22kpx\" (UID: \"65f406bd-2301-4de9-9b94-1fb285e03d6e\") " pod="openshift-insights/insights-runtime-extractor-22kpx" Apr 22 17:34:39.934614 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.932513 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/65f406bd-2301-4de9-9b94-1fb285e03d6e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-22kpx\" (UID: \"65f406bd-2301-4de9-9b94-1fb285e03d6e\") " pod="openshift-insights/insights-runtime-extractor-22kpx" Apr 22 17:34:39.934614 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.932551 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f882c\" (UniqueName: \"kubernetes.io/projected/4cb6c6fd-89a2-40ee-b3e2-d562a853e308-kube-api-access-f882c\") pod \"downloads-6bcc868b7-sm7xt\" (UID: \"4cb6c6fd-89a2-40ee-b3e2-d562a853e308\") " pod="openshift-console/downloads-6bcc868b7-sm7xt" Apr 22 17:34:39.934614 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.932595 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/65f406bd-2301-4de9-9b94-1fb285e03d6e-crio-socket\") pod \"insights-runtime-extractor-22kpx\" (UID: \"65f406bd-2301-4de9-9b94-1fb285e03d6e\") " pod="openshift-insights/insights-runtime-extractor-22kpx" Apr 22 17:34:39.934614 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.932635 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/65f406bd-2301-4de9-9b94-1fb285e03d6e-data-volume\") pod \"insights-runtime-extractor-22kpx\" (UID: \"65f406bd-2301-4de9-9b94-1fb285e03d6e\") " pod="openshift-insights/insights-runtime-extractor-22kpx" Apr 22 17:34:39.934614 ip-10-0-131-22 kubenswrapper[2572]: I0422 
17:34:39.932710 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs\") pod \"network-metrics-daemon-s8svp\" (UID: \"9feb1c60-1e90-405e-9beb-753e0747aed0\") " pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:34:39.934614 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.932755 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/65f406bd-2301-4de9-9b94-1fb285e03d6e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-22kpx\" (UID: \"65f406bd-2301-4de9-9b94-1fb285e03d6e\") " pod="openshift-insights/insights-runtime-extractor-22kpx" Apr 22 17:34:39.934614 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.934158 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/65f406bd-2301-4de9-9b94-1fb285e03d6e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-22kpx\" (UID: \"65f406bd-2301-4de9-9b94-1fb285e03d6e\") " pod="openshift-insights/insights-runtime-extractor-22kpx" Apr 22 17:34:39.934614 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.934544 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/65f406bd-2301-4de9-9b94-1fb285e03d6e-crio-socket\") pod \"insights-runtime-extractor-22kpx\" (UID: \"65f406bd-2301-4de9-9b94-1fb285e03d6e\") " pod="openshift-insights/insights-runtime-extractor-22kpx" Apr 22 17:34:39.937248 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.936712 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/65f406bd-2301-4de9-9b94-1fb285e03d6e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-22kpx\" (UID: \"65f406bd-2301-4de9-9b94-1fb285e03d6e\") " pod="openshift-insights/insights-runtime-extractor-22kpx" Apr 22 17:34:39.937248 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.937231 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 17:34:39.939231 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.939190 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-pcpz9"] Apr 22 17:34:39.943803 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.943752 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/65f406bd-2301-4de9-9b94-1fb285e03d6e-data-volume\") pod \"insights-runtime-extractor-22kpx\" (UID: \"65f406bd-2301-4de9-9b94-1fb285e03d6e\") " pod="openshift-insights/insights-runtime-extractor-22kpx" Apr 22 17:34:39.948623 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.948598 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9feb1c60-1e90-405e-9beb-753e0747aed0-metrics-certs\") pod \"network-metrics-daemon-s8svp\" (UID: \"9feb1c60-1e90-405e-9beb-753e0747aed0\") " pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:34:39.949112 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.949085 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w58f2\" (UniqueName: 
\"kubernetes.io/projected/65f406bd-2301-4de9-9b94-1fb285e03d6e-kube-api-access-w58f2\") pod \"insights-runtime-extractor-22kpx\" (UID: \"65f406bd-2301-4de9-9b94-1fb285e03d6e\") " pod="openshift-insights/insights-runtime-extractor-22kpx" Apr 22 17:34:39.977451 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.977400 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-s28sr"] Apr 22 17:34:39.981186 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:34:39.981159 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddabb0188_2e22_40cb_b765_c6e0a5a0b030.slice/crio-06d5b80da1bc2d670af8df6ee9aa2b9c1cfc5fc415dae7244d560b3f4fc03bd6 WatchSource:0}: Error finding container 06d5b80da1bc2d670af8df6ee9aa2b9c1cfc5fc415dae7244d560b3f4fc03bd6: Status 404 returned error can't find the container with id 06d5b80da1bc2d670af8df6ee9aa2b9c1cfc5fc415dae7244d560b3f4fc03bd6 Apr 22 17:34:39.994300 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:39.994272 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vj4xl"] Apr 22 17:34:39.997042 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:34:39.997015 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9d86a41_acf5_4d73_ab7e_be8d3ed73a9a.slice/crio-8f016a620744e0315576cdb087e5115cd0822109c0f13aa2948b1e01b064f677 WatchSource:0}: Error finding container 8f016a620744e0315576cdb087e5115cd0822109c0f13aa2948b1e01b064f677: Status 404 returned error can't find the container with id 8f016a620744e0315576cdb087e5115cd0822109c0f13aa2948b1e01b064f677 Apr 22 17:34:40.034007 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.033957 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f882c\" (UniqueName: \"kubernetes.io/projected/4cb6c6fd-89a2-40ee-b3e2-d562a853e308-kube-api-access-f882c\") pod \"downloads-6bcc868b7-sm7xt\" (UID: \"4cb6c6fd-89a2-40ee-b3e2-d562a853e308\") " pod="openshift-console/downloads-6bcc868b7-sm7xt" Apr 22 17:34:40.055935 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.055899 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jqt69\"" Apr 22 17:34:40.057059 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.057030 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f882c\" (UniqueName: \"kubernetes.io/projected/4cb6c6fd-89a2-40ee-b3e2-d562a853e308-kube-api-access-f882c\") pod \"downloads-6bcc868b7-sm7xt\" (UID: \"4cb6c6fd-89a2-40ee-b3e2-d562a853e308\") " pod="openshift-console/downloads-6bcc868b7-sm7xt" Apr 22 17:34:40.062785 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.062765 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s8svp" Apr 22 17:34:40.086210 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.086178 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-22kpx" Apr 22 17:34:40.196918 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.196829 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-sm7xt" Apr 22 17:34:40.217183 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.217131 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-s8svp"] Apr 22 17:34:40.231059 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:34:40.231007 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9feb1c60_1e90_405e_9beb_753e0747aed0.slice/crio-2401440533de4a48cf37b8e7939420c4ca1d863870cfff9d2177d6c3c509e066 WatchSource:0}: Error finding container 2401440533de4a48cf37b8e7939420c4ca1d863870cfff9d2177d6c3c509e066: Status 404 returned error can't find the container with id 2401440533de4a48cf37b8e7939420c4ca1d863870cfff9d2177d6c3c509e066 Apr 22 17:34:40.245782 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.245724 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-22kpx"] Apr 22 17:34:40.250374 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:34:40.250343 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65f406bd_2301_4de9_9b94_1fb285e03d6e.slice/crio-32ca6af88f7dcc11d4545214c3103ba5ccc8f9e691f1baaea9f0083b447a414c WatchSource:0}: Error finding container 32ca6af88f7dcc11d4545214c3103ba5ccc8f9e691f1baaea9f0083b447a414c: Status 404 returned error can't find the container with id 32ca6af88f7dcc11d4545214c3103ba5ccc8f9e691f1baaea9f0083b447a414c Apr 22 17:34:40.339993 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.339962 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-sm7xt"] Apr 22 17:34:40.343220 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:34:40.343186 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cb6c6fd_89a2_40ee_b3e2_d562a853e308.slice/crio-5fd19a014f4f1221987dad7ae1258ccecf4a5d610e614e45e61b934a94eb5f99 WatchSource:0}: Error finding container 5fd19a014f4f1221987dad7ae1258ccecf4a5d610e614e45e61b934a94eb5f99: Status 404 returned error can't find the container with id 5fd19a014f4f1221987dad7ae1258ccecf4a5d610e614e45e61b934a94eb5f99 Apr 22 17:34:40.444984 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.444893 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-9d9fb4b58-dpchv" event={"ID":"17af67ec-8577-45de-abbb-01a7199ee7cd","Type":"ContainerStarted","Data":"a489e992e7f873fcce0dd3212dedf2c6f9bfeadb3c185aab3cdb560fb2be003c"} Apr 22 17:34:40.444984 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.444952 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-9d9fb4b58-dpchv" event={"ID":"17af67ec-8577-45de-abbb-01a7199ee7cd","Type":"ContainerStarted","Data":"e99d5c64b57139e94afacf98592d33e6d5eb66dcb507a6d410b2cb250b43c612"} Apr 22 17:34:40.446483 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.446455 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" event={"ID":"31e11e0d-c49c-4634-b336-26f608c0be83","Type":"ContainerStarted","Data":"1786b16f581a6a55d6f787039367561bb4c297588bcb82881d7514721a921277"} Apr 22 17:34:40.446611 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.446498 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" 
event={"ID":"31e11e0d-c49c-4634-b336-26f608c0be83","Type":"ContainerStarted","Data":"054dd7308974508782b0ed3ba184e37cccbe651fb193bb89ae615b5e29e4d77a"} Apr 22 17:34:40.447686 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.447613 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-s8svp" event={"ID":"9feb1c60-1e90-405e-9beb-753e0747aed0","Type":"ContainerStarted","Data":"2401440533de4a48cf37b8e7939420c4ca1d863870cfff9d2177d6c3c509e066"} Apr 22 17:34:40.448957 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.448935 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw" event={"ID":"e9ca84a4-7331-4976-9476-b7842a9814e3","Type":"ContainerStarted","Data":"f8d6e68704847cca55290d7bc4e79b74d6b17e8c3763d39e21ee81fe7ab09f2d"} Apr 22 17:34:40.450468 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.450447 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5944f7956-vsk55" event={"ID":"84e06372-35da-4f2d-84a9-4f0537970fba","Type":"ContainerStarted","Data":"f5ebc50145fa5e011688de62dcd62f448d1e61b57fed0d431dc280b6b73c36e5"} Apr 22 17:34:40.450615 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.450597 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5944f7956-vsk55" event={"ID":"84e06372-35da-4f2d-84a9-4f0537970fba","Type":"ContainerStarted","Data":"3225cbad06026067dd3de4e3280652f59060d4b836eab60b0f38bdfc42d38cac"} Apr 22 17:34:40.450728 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.450713 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:34:40.451526 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.451502 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-s28sr" event={"ID":"dabb0188-2e22-40cb-b765-c6e0a5a0b030","Type":"ContainerStarted","Data":"06d5b80da1bc2d670af8df6ee9aa2b9c1cfc5fc415dae7244d560b3f4fc03bd6"} Apr 22 17:34:40.452717 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.452696 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-pcpz9" event={"ID":"59ff7442-70a1-4df1-a3a2-9eff7d027d6e","Type":"ContainerStarted","Data":"71fd9bb185bb70a1eb5a2e32688463a674556f24686f0f36531f29a9afe21e82"} Apr 22 17:34:40.453824 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.453805 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vj4xl" event={"ID":"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a","Type":"ContainerStarted","Data":"8f016a620744e0315576cdb087e5115cd0822109c0f13aa2948b1e01b064f677"} Apr 22 17:34:40.454919 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.454894 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-sm7xt" event={"ID":"4cb6c6fd-89a2-40ee-b3e2-d562a853e308","Type":"ContainerStarted","Data":"5fd19a014f4f1221987dad7ae1258ccecf4a5d610e614e45e61b934a94eb5f99"} Apr 22 17:34:40.456397 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.456375 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-22kpx" event={"ID":"65f406bd-2301-4de9-9b94-1fb285e03d6e","Type":"ContainerStarted","Data":"eb1422041ed974cd0d1f53ab0647f20d2fe410e0331717eb8bf1590a60bea0df"} Apr 22 17:34:40.456513 ip-10-0-131-22 kubenswrapper[2572]: I0422 
17:34:40.456403 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-22kpx" event={"ID":"65f406bd-2301-4de9-9b94-1fb285e03d6e","Type":"ContainerStarted","Data":"32ca6af88f7dcc11d4545214c3103ba5ccc8f9e691f1baaea9f0083b447a414c"} Apr 22 17:34:40.465153 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.465101 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-9d9fb4b58-dpchv" podStartSLOduration=53.465086067 podStartE2EDuration="53.465086067s" podCreationTimestamp="2026-04-22 17:33:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:34:40.463760206 +0000 UTC m=+66.931021909" watchObservedRunningTime="2026-04-22 17:34:40.465086067 +0000 UTC m=+66.932347767" Apr 22 17:34:40.489744 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.489696 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5944f7956-vsk55" podStartSLOduration=66.489680305 podStartE2EDuration="1m6.489680305s" podCreationTimestamp="2026-04-22 17:33:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:34:40.488792316 +0000 UTC m=+66.956054025" watchObservedRunningTime="2026-04-22 17:34:40.489680305 +0000 UTC m=+66.956942006" Apr 22 17:34:40.509588 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.509536 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" podStartSLOduration=53.509518917 podStartE2EDuration="53.509518917s" podCreationTimestamp="2026-04-22 17:33:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:34:40.508725994 +0000 UTC m=+66.975987697" watchObservedRunningTime="2026-04-22 17:34:40.509518917 +0000 UTC m=+66.976780618" Apr 22 17:34:40.605769 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.605732 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:40.608518 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:40.608493 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:41.462457 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:41.462383 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:41.464184 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:41.463909 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-9d9fb4b58-dpchv" Apr 22 17:34:45.475851 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:45.475812 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-s8svp" event={"ID":"9feb1c60-1e90-405e-9beb-753e0747aed0","Type":"ContainerStarted","Data":"4c9268c0de6ab153a53ce763af55a36330b0d5a79132926f2a2660fe7f244a99"} Apr 22 17:34:45.475851 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:45.475857 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-s8svp" 
event={"ID":"9feb1c60-1e90-405e-9beb-753e0747aed0","Type":"ContainerStarted","Data":"cceb189310284e19cbc76377c150a20ad71c88e33d06d3f6d75ae32ab8913bdb"} Apr 22 17:34:45.477983 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:45.477953 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw" event={"ID":"e9ca84a4-7331-4976-9476-b7842a9814e3","Type":"ContainerStarted","Data":"447bc7764ff6a0b0af542f10e592808846399bbe8b9f528de7b87d323c863cf1"} Apr 22 17:34:45.479925 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:45.479895 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-s28sr" event={"ID":"dabb0188-2e22-40cb-b765-c6e0a5a0b030","Type":"ContainerStarted","Data":"7fb4437c0b212232b97e3be50321e527ad31c7d76ac2ceb3919dad1516c816c2"} Apr 22 17:34:45.481947 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:45.481921 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-pcpz9" event={"ID":"59ff7442-70a1-4df1-a3a2-9eff7d027d6e","Type":"ContainerStarted","Data":"8c1c1cbf657d9d751755d2ef775ca74af682c0482685d65c7a4f94604679f4bd"} Apr 22 17:34:45.483671 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:45.483628 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vj4xl" event={"ID":"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a","Type":"ContainerStarted","Data":"e11c907f43a4decd0efad5eeeaa0e3c75d042fb254deaab668b883d4e096549c"} Apr 22 17:34:45.483671 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:45.483652 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vj4xl" event={"ID":"b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a","Type":"ContainerStarted","Data":"4bb104e5e0bd698651e3a00fefd0329b032d14e8982d62d7f7c417e57444d333"} Apr 22 17:34:45.483799 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:45.483774 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:45.485219 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:45.485199 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-22kpx" event={"ID":"65f406bd-2301-4de9-9b94-1fb285e03d6e","Type":"ContainerStarted","Data":"0c6c547e0f6485e53ef603c1ffb6f625d5fe370b5f60d8dcb0a9e6bfd9117a4a"} Apr 22 17:34:45.500016 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:45.499931 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-s8svp" podStartSLOduration=66.758530621 podStartE2EDuration="1m11.499918519s" podCreationTimestamp="2026-04-22 17:33:34 +0000 UTC" firstStartedPulling="2026-04-22 17:34:40.234131561 +0000 UTC m=+66.701393255" lastFinishedPulling="2026-04-22 17:34:44.975519471 +0000 UTC m=+71.442781153" observedRunningTime="2026-04-22 17:34:45.499576367 +0000 UTC m=+71.966838072" watchObservedRunningTime="2026-04-22 17:34:45.499918519 +0000 UTC m=+71.967180234" Apr 22 17:34:45.527338 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:45.527288 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vj4xl" podStartSLOduration=33.55160447 podStartE2EDuration="38.527271827s" podCreationTimestamp="2026-04-22 17:34:07 +0000 UTC" firstStartedPulling="2026-04-22 17:34:39.998991492 +0000 UTC m=+66.466253172" lastFinishedPulling="2026-04-22 17:34:44.974658835 +0000 UTC m=+71.441920529" 
observedRunningTime="2026-04-22 17:34:45.524953245 +0000 UTC m=+71.992214947" watchObservedRunningTime="2026-04-22 17:34:45.527271827 +0000 UTC m=+71.994533581" Apr 22 17:34:45.549817 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:45.549764 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vpqhw" podStartSLOduration=53.474587168 podStartE2EDuration="58.549750006s" podCreationTimestamp="2026-04-22 17:33:47 +0000 UTC" firstStartedPulling="2026-04-22 17:34:39.900308456 +0000 UTC m=+66.367570151" lastFinishedPulling="2026-04-22 17:34:44.975471296 +0000 UTC m=+71.442732989" observedRunningTime="2026-04-22 17:34:45.547394217 +0000 UTC m=+72.014655919" watchObservedRunningTime="2026-04-22 17:34:45.549750006 +0000 UTC m=+72.017011744" Apr 22 17:34:45.575844 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:45.575795 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-s28sr" podStartSLOduration=33.578410202 podStartE2EDuration="38.575777396s" podCreationTimestamp="2026-04-22 17:34:07 +0000 UTC" firstStartedPulling="2026-04-22 17:34:39.983120954 +0000 UTC m=+66.450382636" lastFinishedPulling="2026-04-22 17:34:44.980488136 +0000 UTC m=+71.447749830" observedRunningTime="2026-04-22 17:34:45.574311132 +0000 UTC m=+72.041572834" watchObservedRunningTime="2026-04-22 17:34:45.575777396 +0000 UTC m=+72.043039097" Apr 22 17:34:45.613025 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:45.612969 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-pcpz9" podStartSLOduration=48.612897216 podStartE2EDuration="53.61295431s" podCreationTimestamp="2026-04-22 17:33:52 +0000 UTC" firstStartedPulling="2026-04-22 17:34:39.951038037 +0000 UTC m=+66.418299720" lastFinishedPulling="2026-04-22 17:34:44.95109512 +0000 UTC m=+71.418356814" observedRunningTime="2026-04-22 17:34:45.612070355 +0000 UTC m=+72.079332059" watchObservedRunningTime="2026-04-22 17:34:45.61295431 +0000 UTC m=+72.080216009" Apr 22 17:34:47.362467 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:47.361823 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mh726" Apr 22 17:34:47.494336 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:47.494297 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-22kpx" event={"ID":"65f406bd-2301-4de9-9b94-1fb285e03d6e","Type":"ContainerStarted","Data":"1ce9d9be9bd305296fda2e0a9ebab8f321e6d8372fcb015599443b0c4ee83e2f"} Apr 22 17:34:47.512726 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:47.512677 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-22kpx" podStartSLOduration=1.5090675569999998 podStartE2EDuration="8.512661716s" podCreationTimestamp="2026-04-22 17:34:39 +0000 UTC" firstStartedPulling="2026-04-22 17:34:40.364269443 +0000 UTC m=+66.831531124" lastFinishedPulling="2026-04-22 17:34:47.367863588 +0000 UTC m=+73.835125283" observedRunningTime="2026-04-22 17:34:47.511745685 +0000 UTC m=+73.979007390" watchObservedRunningTime="2026-04-22 17:34:47.512661716 +0000 UTC m=+73.979923417" Apr 22 17:34:50.447524 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:50.447488 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:34:55.495699 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:55.495564 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vj4xl" Apr 22 17:34:57.212225 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.212186 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-8b28h"] Apr 22 17:34:57.217699 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.217672 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.224094 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.223380 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 17:34:57.224690 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.224312 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 17:34:57.224690 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.224645 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 17:34:57.225001 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.224905 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 17:34:57.225156 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.225122 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mbnkh\"" Apr 22 17:34:57.389274 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.389239 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04770277-1292-4d62-8d2c-a5c19b46b73a-sys\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.389274 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.389278 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/04770277-1292-4d62-8d2c-a5c19b46b73a-node-exporter-textfile\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.389516 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.389314 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bns96\" (UniqueName: \"kubernetes.io/projected/04770277-1292-4d62-8d2c-a5c19b46b73a-kube-api-access-bns96\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.389516 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.389359 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04770277-1292-4d62-8d2c-a5c19b46b73a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.389516 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.389385 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/04770277-1292-4d62-8d2c-a5c19b46b73a-root\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.389516 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.389479 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04770277-1292-4d62-8d2c-a5c19b46b73a-metrics-client-ca\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.389516 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.389515 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/04770277-1292-4d62-8d2c-a5c19b46b73a-node-exporter-tls\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.389757 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.389588 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/04770277-1292-4d62-8d2c-a5c19b46b73a-node-exporter-accelerators-collector-config\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.389757 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.389656 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/04770277-1292-4d62-8d2c-a5c19b46b73a-node-exporter-wtmp\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.490513 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.490381 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04770277-1292-4d62-8d2c-a5c19b46b73a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.490513 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.490456 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/04770277-1292-4d62-8d2c-a5c19b46b73a-root\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.490513 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.490491 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04770277-1292-4d62-8d2c-a5c19b46b73a-metrics-client-ca\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.490761 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.490518 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/04770277-1292-4d62-8d2c-a5c19b46b73a-node-exporter-tls\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.490761 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.490549 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/04770277-1292-4d62-8d2c-a5c19b46b73a-node-exporter-accelerators-collector-config\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.490761 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.490593 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/04770277-1292-4d62-8d2c-a5c19b46b73a-node-exporter-wtmp\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.490761 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.490637 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04770277-1292-4d62-8d2c-a5c19b46b73a-sys\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.490761 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.490660 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/04770277-1292-4d62-8d2c-a5c19b46b73a-node-exporter-textfile\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.490761 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.490693 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bns96\" (UniqueName: \"kubernetes.io/projected/04770277-1292-4d62-8d2c-a5c19b46b73a-kube-api-access-bns96\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.491754 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.491649 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/04770277-1292-4d62-8d2c-a5c19b46b73a-root\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.491754 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.491673 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04770277-1292-4d62-8d2c-a5c19b46b73a-sys\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.491945 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.491821 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/04770277-1292-4d62-8d2c-a5c19b46b73a-node-exporter-wtmp\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.492050 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.492005 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/04770277-1292-4d62-8d2c-a5c19b46b73a-node-exporter-accelerators-collector-config\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.492257 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.492208 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/04770277-1292-4d62-8d2c-a5c19b46b73a-node-exporter-textfile\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.492376 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.492354 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04770277-1292-4d62-8d2c-a5c19b46b73a-metrics-client-ca\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.494389 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.494164 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/04770277-1292-4d62-8d2c-a5c19b46b73a-node-exporter-tls\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.494389 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.494311 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04770277-1292-4d62-8d2c-a5c19b46b73a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.507761 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.507684 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bns96\" (UniqueName: \"kubernetes.io/projected/04770277-1292-4d62-8d2c-a5c19b46b73a-kube-api-access-bns96\") pod \"node-exporter-8b28h\" (UID: \"04770277-1292-4d62-8d2c-a5c19b46b73a\") " pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:57.538660 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:57.538624 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-8b28h" Apr 22 17:34:59.577591 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:59.577547 2572 patch_prober.go:28] interesting pod/image-registry-5944f7956-vsk55 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 17:34:59.578157 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:34:59.577619 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5944f7956-vsk55" podUID="84e06372-35da-4f2d-84a9-4f0537970fba" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 17:35:00.453871 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:00.453602 2572 patch_prober.go:28] interesting pod/image-registry-69c75ddc67-c9dl5 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 17:35:00.453871 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:00.453662 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" podUID="31e11e0d-c49c-4634-b336-26f608c0be83" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 17:35:01.397511 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:35:01.397471 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04770277_1292_4d62_8d2c_a5c19b46b73a.slice/crio-8969b8c97e204910c94445e1f11777cf7a9b322ac0e5b31b298225cbf5feaf3c WatchSource:0}: Error finding container 8969b8c97e204910c94445e1f11777cf7a9b322ac0e5b31b298225cbf5feaf3c: Status 404 returned error can't find the container with id 8969b8c97e204910c94445e1f11777cf7a9b322ac0e5b31b298225cbf5feaf3c Apr 22 17:35:01.466746 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:01.466717 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:35:01.539628 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:01.539596 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8b28h" event={"ID":"04770277-1292-4d62-8d2c-a5c19b46b73a","Type":"ContainerStarted","Data":"8969b8c97e204910c94445e1f11777cf7a9b322ac0e5b31b298225cbf5feaf3c"} Apr 22 17:35:02.544605 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:02.544515 2572 generic.go:358] "Generic (PLEG): container finished" podID="04770277-1292-4d62-8d2c-a5c19b46b73a" containerID="96ef5a7b1dd5fa980d085c62d937663b0e256fecccb5fd3eb3da906105256c12" exitCode=0 Apr 22 17:35:02.545030 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:02.544606 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8b28h" event={"ID":"04770277-1292-4d62-8d2c-a5c19b46b73a","Type":"ContainerDied","Data":"96ef5a7b1dd5fa980d085c62d937663b0e256fecccb5fd3eb3da906105256c12"} Apr 22 17:35:02.546279 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:02.546244 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-sm7xt" 
event={"ID":"4cb6c6fd-89a2-40ee-b3e2-d562a853e308","Type":"ContainerStarted","Data":"b897b0b88c1093891346c15135f15fb5e7ba02d01807a85d8f65e8781fcb0b13"} Apr 22 17:35:02.546477 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:02.546456 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-sm7xt" Apr 22 17:35:02.560691 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:02.560661 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-sm7xt" Apr 22 17:35:03.552101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:03.552059 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8b28h" event={"ID":"04770277-1292-4d62-8d2c-a5c19b46b73a","Type":"ContainerStarted","Data":"95b13e499a9d4827fa8b14bd17cdbf66d9096cbe4347e053686fa185dcf748e7"} Apr 22 17:35:03.552101 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:03.552104 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8b28h" event={"ID":"04770277-1292-4d62-8d2c-a5c19b46b73a","Type":"ContainerStarted","Data":"09d5dbe2e2228ed44ee0397b596470cffba78b4536dca56e2fabbbb734187267"} Apr 22 17:35:03.574469 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:03.574385 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-sm7xt" podStartSLOduration=3.435071015 podStartE2EDuration="24.574370734s" podCreationTimestamp="2026-04-22 17:34:39 +0000 UTC" firstStartedPulling="2026-04-22 17:34:40.345320554 +0000 UTC m=+66.812582234" lastFinishedPulling="2026-04-22 17:35:01.484620257 +0000 UTC m=+87.951881953" observedRunningTime="2026-04-22 17:35:02.5861299 +0000 UTC m=+89.053391603" watchObservedRunningTime="2026-04-22 17:35:03.574370734 +0000 UTC m=+90.041632435" Apr 22 17:35:03.575098 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:03.575045 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-8b28h" podStartSLOduration=5.712389684 podStartE2EDuration="6.575034431s" podCreationTimestamp="2026-04-22 17:34:57 +0000 UTC" firstStartedPulling="2026-04-22 17:35:01.399164951 +0000 UTC m=+87.866426631" lastFinishedPulling="2026-04-22 17:35:02.261809688 +0000 UTC m=+88.729071378" observedRunningTime="2026-04-22 17:35:03.573392903 +0000 UTC m=+90.040654630" watchObservedRunningTime="2026-04-22 17:35:03.575034431 +0000 UTC m=+90.042296184" Apr 22 17:35:05.469228 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.469158 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" podUID="31e11e0d-c49c-4634-b336-26f608c0be83" containerName="registry" containerID="cri-o://1786b16f581a6a55d6f787039367561bb4c297588bcb82881d7514721a921277" gracePeriod=30 Apr 22 17:35:05.754943 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.754911 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:35:05.864103 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.864064 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtclj\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-kube-api-access-jtclj\") pod \"31e11e0d-c49c-4634-b336-26f608c0be83\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " Apr 22 17:35:05.864296 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.864169 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/31e11e0d-c49c-4634-b336-26f608c0be83-ca-trust-extracted\") pod \"31e11e0d-c49c-4634-b336-26f608c0be83\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " Apr 22 17:35:05.864296 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.864201 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/31e11e0d-c49c-4634-b336-26f608c0be83-trusted-ca\") pod \"31e11e0d-c49c-4634-b336-26f608c0be83\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " Apr 22 17:35:05.864296 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.864242 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/31e11e0d-c49c-4634-b336-26f608c0be83-registry-certificates\") pod \"31e11e0d-c49c-4634-b336-26f608c0be83\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " Apr 22 17:35:05.864296 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.864268 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-bound-sa-token\") pod \"31e11e0d-c49c-4634-b336-26f608c0be83\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " Apr 22 17:35:05.864296 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.864294 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls\") pod \"31e11e0d-c49c-4634-b336-26f608c0be83\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " Apr 22 17:35:05.864599 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.864318 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/31e11e0d-c49c-4634-b336-26f608c0be83-installation-pull-secrets\") pod \"31e11e0d-c49c-4634-b336-26f608c0be83\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " Apr 22 17:35:05.864599 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.864368 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/31e11e0d-c49c-4634-b336-26f608c0be83-image-registry-private-configuration\") pod \"31e11e0d-c49c-4634-b336-26f608c0be83\" (UID: \"31e11e0d-c49c-4634-b336-26f608c0be83\") " Apr 22 17:35:05.864700 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.864657 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31e11e0d-c49c-4634-b336-26f608c0be83-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "31e11e0d-c49c-4634-b336-26f608c0be83" (UID: "31e11e0d-c49c-4634-b336-26f608c0be83"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:35:05.864700 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.864674 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31e11e0d-c49c-4634-b336-26f608c0be83-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "31e11e0d-c49c-4634-b336-26f608c0be83" (UID: "31e11e0d-c49c-4634-b336-26f608c0be83"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:35:05.867366 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.867291 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "31e11e0d-c49c-4634-b336-26f608c0be83" (UID: "31e11e0d-c49c-4634-b336-26f608c0be83"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:35:05.867366 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.867295 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31e11e0d-c49c-4634-b336-26f608c0be83-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "31e11e0d-c49c-4634-b336-26f608c0be83" (UID: "31e11e0d-c49c-4634-b336-26f608c0be83"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:35:05.867714 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.867642 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "31e11e0d-c49c-4634-b336-26f608c0be83" (UID: "31e11e0d-c49c-4634-b336-26f608c0be83"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:35:05.868463 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.867840 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31e11e0d-c49c-4634-b336-26f608c0be83-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "31e11e0d-c49c-4634-b336-26f608c0be83" (UID: "31e11e0d-c49c-4634-b336-26f608c0be83"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:35:05.868463 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.868145 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-kube-api-access-jtclj" (OuterVolumeSpecName: "kube-api-access-jtclj") pod "31e11e0d-c49c-4634-b336-26f608c0be83" (UID: "31e11e0d-c49c-4634-b336-26f608c0be83"). InnerVolumeSpecName "kube-api-access-jtclj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:35:05.876592 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.876557 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31e11e0d-c49c-4634-b336-26f608c0be83-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "31e11e0d-c49c-4634-b336-26f608c0be83" (UID: "31e11e0d-c49c-4634-b336-26f608c0be83"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:35:05.965714 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.965668 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/31e11e0d-c49c-4634-b336-26f608c0be83-image-registry-private-configuration\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:35:05.965714 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.965712 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jtclj\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-kube-api-access-jtclj\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:35:05.965714 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.965724 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/31e11e0d-c49c-4634-b336-26f608c0be83-ca-trust-extracted\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:35:05.965990 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.965733 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/31e11e0d-c49c-4634-b336-26f608c0be83-trusted-ca\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:35:05.965990 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.965742 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/31e11e0d-c49c-4634-b336-26f608c0be83-registry-certificates\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:35:05.965990 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.965750 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-bound-sa-token\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:35:05.965990 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.965759 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e11e0d-c49c-4634-b336-26f608c0be83-registry-tls\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:35:05.965990 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:05.965770 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/31e11e0d-c49c-4634-b336-26f608c0be83-installation-pull-secrets\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:35:06.569538 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:06.569497 2572 generic.go:358] "Generic (PLEG): container finished" podID="31e11e0d-c49c-4634-b336-26f608c0be83" containerID="1786b16f581a6a55d6f787039367561bb4c297588bcb82881d7514721a921277" exitCode=0 Apr 22 17:35:06.570002 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:06.569579 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" Apr 22 17:35:06.570002 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:06.569588 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" event={"ID":"31e11e0d-c49c-4634-b336-26f608c0be83","Type":"ContainerDied","Data":"1786b16f581a6a55d6f787039367561bb4c297588bcb82881d7514721a921277"} Apr 22 17:35:06.570002 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:06.569626 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69c75ddc67-c9dl5" event={"ID":"31e11e0d-c49c-4634-b336-26f608c0be83","Type":"ContainerDied","Data":"054dd7308974508782b0ed3ba184e37cccbe651fb193bb89ae615b5e29e4d77a"} Apr 22 17:35:06.570002 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:06.569642 2572 scope.go:117] "RemoveContainer" containerID="1786b16f581a6a55d6f787039367561bb4c297588bcb82881d7514721a921277" Apr 22 17:35:06.578687 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:06.578667 2572 scope.go:117] "RemoveContainer" containerID="1786b16f581a6a55d6f787039367561bb4c297588bcb82881d7514721a921277" Apr 22 17:35:06.578986 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:35:06.578958 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1786b16f581a6a55d6f787039367561bb4c297588bcb82881d7514721a921277\": container with ID starting with 1786b16f581a6a55d6f787039367561bb4c297588bcb82881d7514721a921277 not found: ID does not exist" containerID="1786b16f581a6a55d6f787039367561bb4c297588bcb82881d7514721a921277" Apr 22 17:35:06.579086 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:06.578992 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1786b16f581a6a55d6f787039367561bb4c297588bcb82881d7514721a921277"} err="failed to get container status \"1786b16f581a6a55d6f787039367561bb4c297588bcb82881d7514721a921277\": rpc error: code = NotFound desc = could not find container \"1786b16f581a6a55d6f787039367561bb4c297588bcb82881d7514721a921277\": container with ID starting with 1786b16f581a6a55d6f787039367561bb4c297588bcb82881d7514721a921277 not found: ID does not exist" Apr 22 17:35:06.587259 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:06.587228 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69c75ddc67-c9dl5"] Apr 22 17:35:06.592650 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:06.592626 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-69c75ddc67-c9dl5"] Apr 22 17:35:08.129346 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:08.129304 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31e11e0d-c49c-4634-b336-26f608c0be83" path="/var/lib/kubelet/pods/31e11e0d-c49c-4634-b336-26f608c0be83/volumes" Apr 22 17:35:08.374479 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:08.374376 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5944f7956-vsk55"] Apr 22 17:35:21.617589 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:21.617500 2572 generic.go:358] "Generic (PLEG): container finished" podID="2dd100f6-0060-426d-9cf7-a7f9fafa003a" containerID="dd2e1785346a61fe4c5b4a25ba44e7d9b528e3b5ee1c2bd7c11bcbc56e77e236" exitCode=0 Apr 22 17:35:21.617589 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:21.617574 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-operator-585dfdc468-7rk27" event={"ID":"2dd100f6-0060-426d-9cf7-a7f9fafa003a","Type":"ContainerDied","Data":"dd2e1785346a61fe4c5b4a25ba44e7d9b528e3b5ee1c2bd7c11bcbc56e77e236"} Apr 22 17:35:21.618004 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:21.617899 2572 scope.go:117] "RemoveContainer" containerID="dd2e1785346a61fe4c5b4a25ba44e7d9b528e3b5ee1c2bd7c11bcbc56e77e236" Apr 22 17:35:22.547208 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:22.547149 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-9d9fb4b58-dpchv_17af67ec-8577-45de-abbb-01a7199ee7cd/router/0.log" Apr 22 17:35:22.572210 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:22.572179 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-s28sr_dabb0188-2e22-40cb-b765-c6e0a5a0b030/serve-healthcheck-canary/0.log" Apr 22 17:35:22.622526 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:22.622491 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-7rk27" event={"ID":"2dd100f6-0060-426d-9cf7-a7f9fafa003a","Type":"ContainerStarted","Data":"e95ec5af53a3993b412e753efa044a8a226702c6d2c63fd2ebc6a33ba4076c27"} Apr 22 17:35:26.635276 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:26.635181 2572 generic.go:358] "Generic (PLEG): container finished" podID="346331ec-1cea-46ea-8952-6af403c257c0" containerID="7167e4fa3954dac7a1d0036ad421e79895b70cd6b2f1999466a60010488d81ed" exitCode=0 Apr 22 17:35:26.635276 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:26.635237 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6bzf9" event={"ID":"346331ec-1cea-46ea-8952-6af403c257c0","Type":"ContainerDied","Data":"7167e4fa3954dac7a1d0036ad421e79895b70cd6b2f1999466a60010488d81ed"} Apr 22 17:35:26.635739 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:26.635539 2572 scope.go:117] "RemoveContainer" containerID="7167e4fa3954dac7a1d0036ad421e79895b70cd6b2f1999466a60010488d81ed" Apr 22 17:35:27.639942 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:27.639902 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6bzf9" event={"ID":"346331ec-1cea-46ea-8952-6af403c257c0","Type":"ContainerStarted","Data":"eca93f9ea444f9d2e7c0fee90678224554476a5cc5001df9d6c7772cf422bdba"} Apr 22 17:35:33.396162 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.396102 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5944f7956-vsk55" podUID="84e06372-35da-4f2d-84a9-4f0537970fba" containerName="registry" containerID="cri-o://f5ebc50145fa5e011688de62dcd62f448d1e61b57fed0d431dc280b6b73c36e5" gracePeriod=30 Apr 22 17:35:33.657396 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.657359 2572 generic.go:358] "Generic (PLEG): container finished" podID="84e06372-35da-4f2d-84a9-4f0537970fba" containerID="f5ebc50145fa5e011688de62dcd62f448d1e61b57fed0d431dc280b6b73c36e5" exitCode=0 Apr 22 17:35:33.657553 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.657437 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5944f7956-vsk55" event={"ID":"84e06372-35da-4f2d-84a9-4f0537970fba","Type":"ContainerDied","Data":"f5ebc50145fa5e011688de62dcd62f448d1e61b57fed0d431dc280b6b73c36e5"} Apr 22 
17:35:33.657553 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.657475 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5944f7956-vsk55" event={"ID":"84e06372-35da-4f2d-84a9-4f0537970fba","Type":"ContainerDied","Data":"3225cbad06026067dd3de4e3280652f59060d4b836eab60b0f38bdfc42d38cac"} Apr 22 17:35:33.657553 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.657488 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3225cbad06026067dd3de4e3280652f59060d4b836eab60b0f38bdfc42d38cac" Apr 22 17:35:33.670504 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.670479 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:35:33.818375 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.818341 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/84e06372-35da-4f2d-84a9-4f0537970fba-ca-trust-extracted\") pod \"84e06372-35da-4f2d-84a9-4f0537970fba\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " Apr 22 17:35:33.818609 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.818404 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr9rz\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-kube-api-access-xr9rz\") pod \"84e06372-35da-4f2d-84a9-4f0537970fba\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " Apr 22 17:35:33.818609 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.818480 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/84e06372-35da-4f2d-84a9-4f0537970fba-installation-pull-secrets\") pod \"84e06372-35da-4f2d-84a9-4f0537970fba\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " Apr 22 17:35:33.818609 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.818522 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/84e06372-35da-4f2d-84a9-4f0537970fba-image-registry-private-configuration\") pod \"84e06372-35da-4f2d-84a9-4f0537970fba\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " Apr 22 17:35:33.818609 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.818541 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84e06372-35da-4f2d-84a9-4f0537970fba-trusted-ca\") pod \"84e06372-35da-4f2d-84a9-4f0537970fba\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " Apr 22 17:35:33.818609 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.818556 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/84e06372-35da-4f2d-84a9-4f0537970fba-registry-certificates\") pod \"84e06372-35da-4f2d-84a9-4f0537970fba\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " Apr 22 17:35:33.818609 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.818573 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-bound-sa-token\") pod \"84e06372-35da-4f2d-84a9-4f0537970fba\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " Apr 22 17:35:33.818609 ip-10-0-131-22 kubenswrapper[2572]: 
I0422 17:35:33.818599 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls\") pod \"84e06372-35da-4f2d-84a9-4f0537970fba\" (UID: \"84e06372-35da-4f2d-84a9-4f0537970fba\") " Apr 22 17:35:33.819048 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.818979 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84e06372-35da-4f2d-84a9-4f0537970fba-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "84e06372-35da-4f2d-84a9-4f0537970fba" (UID: "84e06372-35da-4f2d-84a9-4f0537970fba"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:35:33.819048 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.818987 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84e06372-35da-4f2d-84a9-4f0537970fba-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "84e06372-35da-4f2d-84a9-4f0537970fba" (UID: "84e06372-35da-4f2d-84a9-4f0537970fba"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:35:33.821026 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.820992 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "84e06372-35da-4f2d-84a9-4f0537970fba" (UID: "84e06372-35da-4f2d-84a9-4f0537970fba"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:35:33.821151 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.821000 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "84e06372-35da-4f2d-84a9-4f0537970fba" (UID: "84e06372-35da-4f2d-84a9-4f0537970fba"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:35:33.821151 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.821125 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-kube-api-access-xr9rz" (OuterVolumeSpecName: "kube-api-access-xr9rz") pod "84e06372-35da-4f2d-84a9-4f0537970fba" (UID: "84e06372-35da-4f2d-84a9-4f0537970fba"). InnerVolumeSpecName "kube-api-access-xr9rz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:35:33.821271 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.821232 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84e06372-35da-4f2d-84a9-4f0537970fba-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "84e06372-35da-4f2d-84a9-4f0537970fba" (UID: "84e06372-35da-4f2d-84a9-4f0537970fba"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:35:33.821323 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.821288 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84e06372-35da-4f2d-84a9-4f0537970fba-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "84e06372-35da-4f2d-84a9-4f0537970fba" (UID: "84e06372-35da-4f2d-84a9-4f0537970fba"). 
InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:35:33.831301 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.831267 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84e06372-35da-4f2d-84a9-4f0537970fba-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "84e06372-35da-4f2d-84a9-4f0537970fba" (UID: "84e06372-35da-4f2d-84a9-4f0537970fba"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:35:33.919508 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.919396 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/84e06372-35da-4f2d-84a9-4f0537970fba-ca-trust-extracted\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:35:33.919508 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.919457 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xr9rz\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-kube-api-access-xr9rz\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:35:33.919508 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.919472 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/84e06372-35da-4f2d-84a9-4f0537970fba-installation-pull-secrets\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:35:33.919508 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.919487 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/84e06372-35da-4f2d-84a9-4f0537970fba-image-registry-private-configuration\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:35:33.919508 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.919502 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84e06372-35da-4f2d-84a9-4f0537970fba-trusted-ca\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:35:33.919805 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.919514 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/84e06372-35da-4f2d-84a9-4f0537970fba-registry-certificates\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:35:33.919805 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.919527 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-bound-sa-token\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:35:33.919805 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:33.919538 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84e06372-35da-4f2d-84a9-4f0537970fba-registry-tls\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:35:34.662220 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:34.662185 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5944f7956-vsk55" Apr 22 17:35:34.683954 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:34.683918 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5944f7956-vsk55"] Apr 22 17:35:34.687250 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:34.687212 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5944f7956-vsk55"] Apr 22 17:35:36.128184 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:36.128141 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84e06372-35da-4f2d-84a9-4f0537970fba" path="/var/lib/kubelet/pods/84e06372-35da-4f2d-84a9-4f0537970fba/volumes" Apr 22 17:35:41.690275 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:41.690241 2572 generic.go:358] "Generic (PLEG): container finished" podID="7526861b-0afa-4db9-9077-282b6ab524f3" containerID="3b5944634ca4c32043fe3e691f913f44425bc9224e6b4c239efdaa89f9b45eb4" exitCode=0 Apr 22 17:35:41.690732 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:41.690301 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8cqtb" event={"ID":"7526861b-0afa-4db9-9077-282b6ab524f3","Type":"ContainerDied","Data":"3b5944634ca4c32043fe3e691f913f44425bc9224e6b4c239efdaa89f9b45eb4"} Apr 22 17:35:41.690732 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:41.690628 2572 scope.go:117] "RemoveContainer" containerID="3b5944634ca4c32043fe3e691f913f44425bc9224e6b4c239efdaa89f9b45eb4" Apr 22 17:35:42.698618 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:35:42.698577 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8cqtb" event={"ID":"7526861b-0afa-4db9-9077-282b6ab524f3","Type":"ContainerStarted","Data":"40863f2ac7d86012b7b4ac81be599c926c2dca569a78628005c1f51e026a0c90"} Apr 22 17:38:09.065944 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.065864 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7"] Apr 22 17:38:09.066388 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.066154 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84e06372-35da-4f2d-84a9-4f0537970fba" containerName="registry" Apr 22 17:38:09.066388 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.066164 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e06372-35da-4f2d-84a9-4f0537970fba" containerName="registry" Apr 22 17:38:09.066388 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.066174 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31e11e0d-c49c-4634-b336-26f608c0be83" containerName="registry" Apr 22 17:38:09.066388 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.066179 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e11e0d-c49c-4634-b336-26f608c0be83" containerName="registry" Apr 22 17:38:09.066388 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.066227 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="84e06372-35da-4f2d-84a9-4f0537970fba" containerName="registry" Apr 22 17:38:09.066388 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.066235 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="31e11e0d-c49c-4634-b336-26f608c0be83" containerName="registry" Apr 22 17:38:09.069202 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.069185 2572 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7" Apr 22 17:38:09.072267 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.072241 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 17:38:09.072500 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.072487 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-frh6k\"" Apr 22 17:38:09.073104 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.073086 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 17:38:09.080045 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.080022 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7"] Apr 22 17:38:09.193078 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.193042 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj64q\" (UniqueName: \"kubernetes.io/projected/fa2a82e7-b9ab-4d31-8a5c-72f642de0c45-kube-api-access-vj64q\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7\" (UID: \"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7" Apr 22 17:38:09.193269 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.193082 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fa2a82e7-b9ab-4d31-8a5c-72f642de0c45-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7\" (UID: \"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7" Apr 22 17:38:09.193269 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.193119 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fa2a82e7-b9ab-4d31-8a5c-72f642de0c45-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7\" (UID: \"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7" Apr 22 17:38:09.294112 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.294075 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vj64q\" (UniqueName: \"kubernetes.io/projected/fa2a82e7-b9ab-4d31-8a5c-72f642de0c45-kube-api-access-vj64q\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7\" (UID: \"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7" Apr 22 17:38:09.294112 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.294115 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fa2a82e7-b9ab-4d31-8a5c-72f642de0c45-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7\" (UID: \"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7" Apr 22 17:38:09.294370 ip-10-0-131-22 
kubenswrapper[2572]: I0422 17:38:09.294139 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fa2a82e7-b9ab-4d31-8a5c-72f642de0c45-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7\" (UID: \"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7" Apr 22 17:38:09.294501 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.294484 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fa2a82e7-b9ab-4d31-8a5c-72f642de0c45-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7\" (UID: \"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7" Apr 22 17:38:09.294543 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.294512 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fa2a82e7-b9ab-4d31-8a5c-72f642de0c45-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7\" (UID: \"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7" Apr 22 17:38:09.302832 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.302804 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj64q\" (UniqueName: \"kubernetes.io/projected/fa2a82e7-b9ab-4d31-8a5c-72f642de0c45-kube-api-access-vj64q\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7\" (UID: \"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7" Apr 22 17:38:09.377938 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.377831 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7" Apr 22 17:38:09.502581 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:09.502553 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7"] Apr 22 17:38:09.505023 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:38:09.504993 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa2a82e7_b9ab_4d31_8a5c_72f642de0c45.slice/crio-b92dcd1668e998a7ad966961e0c5afbec71794402190ed5e6d31702f51ad8328 WatchSource:0}: Error finding container b92dcd1668e998a7ad966961e0c5afbec71794402190ed5e6d31702f51ad8328: Status 404 returned error can't find the container with id b92dcd1668e998a7ad966961e0c5afbec71794402190ed5e6d31702f51ad8328 Apr 22 17:38:10.114916 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:10.114868 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7" event={"ID":"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45","Type":"ContainerStarted","Data":"b92dcd1668e998a7ad966961e0c5afbec71794402190ed5e6d31702f51ad8328"} Apr 22 17:38:15.131647 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:15.131602 2572 generic.go:358] "Generic (PLEG): container finished" podID="fa2a82e7-b9ab-4d31-8a5c-72f642de0c45" containerID="a3b7c0e23251f3fb3af85c42648abc5c1b101b8b5d883654279ef038152ec10e" exitCode=0 Apr 22 17:38:15.132070 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:15.131686 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7" event={"ID":"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45","Type":"ContainerDied","Data":"a3b7c0e23251f3fb3af85c42648abc5c1b101b8b5d883654279ef038152ec10e"} Apr 22 17:38:17.139255 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:17.139225 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7" event={"ID":"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45","Type":"ContainerStarted","Data":"f3370550116d9ad04383cd5c58b053d71352e87b0b6d320864f39b493ff2d4e0"} Apr 22 17:38:18.142916 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:18.142879 2572 generic.go:358] "Generic (PLEG): container finished" podID="fa2a82e7-b9ab-4d31-8a5c-72f642de0c45" containerID="f3370550116d9ad04383cd5c58b053d71352e87b0b6d320864f39b493ff2d4e0" exitCode=0 Apr 22 17:38:18.143296 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:18.142956 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7" event={"ID":"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45","Type":"ContainerDied","Data":"f3370550116d9ad04383cd5c58b053d71352e87b0b6d320864f39b493ff2d4e0"} Apr 22 17:38:27.171724 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:27.171680 2572 generic.go:358] "Generic (PLEG): container finished" podID="fa2a82e7-b9ab-4d31-8a5c-72f642de0c45" containerID="b6623e21fa98f584d83e67c764eae317c1630f021fba70b287ae7d0116b16fe0" exitCode=0 Apr 22 17:38:27.172092 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:27.171754 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7" 
event={"ID":"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45","Type":"ContainerDied","Data":"b6623e21fa98f584d83e67c764eae317c1630f021fba70b287ae7d0116b16fe0"} Apr 22 17:38:28.288547 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:28.288520 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7" Apr 22 17:38:28.454345 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:28.454241 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj64q\" (UniqueName: \"kubernetes.io/projected/fa2a82e7-b9ab-4d31-8a5c-72f642de0c45-kube-api-access-vj64q\") pod \"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45\" (UID: \"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45\") " Apr 22 17:38:28.454345 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:28.454318 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fa2a82e7-b9ab-4d31-8a5c-72f642de0c45-util\") pod \"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45\" (UID: \"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45\") " Apr 22 17:38:28.454345 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:28.454345 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fa2a82e7-b9ab-4d31-8a5c-72f642de0c45-bundle\") pod \"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45\" (UID: \"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45\") " Apr 22 17:38:28.455060 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:28.455025 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa2a82e7-b9ab-4d31-8a5c-72f642de0c45-bundle" (OuterVolumeSpecName: "bundle") pod "fa2a82e7-b9ab-4d31-8a5c-72f642de0c45" (UID: "fa2a82e7-b9ab-4d31-8a5c-72f642de0c45"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:38:28.456589 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:28.456569 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa2a82e7-b9ab-4d31-8a5c-72f642de0c45-kube-api-access-vj64q" (OuterVolumeSpecName: "kube-api-access-vj64q") pod "fa2a82e7-b9ab-4d31-8a5c-72f642de0c45" (UID: "fa2a82e7-b9ab-4d31-8a5c-72f642de0c45"). InnerVolumeSpecName "kube-api-access-vj64q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:38:28.460478 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:28.460453 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa2a82e7-b9ab-4d31-8a5c-72f642de0c45-util" (OuterVolumeSpecName: "util") pod "fa2a82e7-b9ab-4d31-8a5c-72f642de0c45" (UID: "fa2a82e7-b9ab-4d31-8a5c-72f642de0c45"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:38:28.555752 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:28.555706 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vj64q\" (UniqueName: \"kubernetes.io/projected/fa2a82e7-b9ab-4d31-8a5c-72f642de0c45-kube-api-access-vj64q\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.555752 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:28.555743 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fa2a82e7-b9ab-4d31-8a5c-72f642de0c45-util\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:38:28.555752 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:28.555754 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fa2a82e7-b9ab-4d31-8a5c-72f642de0c45-bundle\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:38:29.178916 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:29.178876 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7" event={"ID":"fa2a82e7-b9ab-4d31-8a5c-72f642de0c45","Type":"ContainerDied","Data":"b92dcd1668e998a7ad966961e0c5afbec71794402190ed5e6d31702f51ad8328"} Apr 22 17:38:29.178916 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:29.178909 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cnt8p7" Apr 22 17:38:29.178916 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:29.178917 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b92dcd1668e998a7ad966961e0c5afbec71794402190ed5e6d31702f51ad8328" Apr 22 17:38:30.832294 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:30.832253 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fr64l"] Apr 22 17:38:30.832731 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:30.832581 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa2a82e7-b9ab-4d31-8a5c-72f642de0c45" containerName="util" Apr 22 17:38:30.832731 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:30.832594 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2a82e7-b9ab-4d31-8a5c-72f642de0c45" containerName="util" Apr 22 17:38:30.832731 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:30.832603 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa2a82e7-b9ab-4d31-8a5c-72f642de0c45" containerName="pull" Apr 22 17:38:30.832731 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:30.832609 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2a82e7-b9ab-4d31-8a5c-72f642de0c45" containerName="pull" Apr 22 17:38:30.832731 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:30.832616 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa2a82e7-b9ab-4d31-8a5c-72f642de0c45" containerName="extract" Apr 22 17:38:30.832731 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:30.832622 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2a82e7-b9ab-4d31-8a5c-72f642de0c45" containerName="extract" Apr 22 17:38:30.832731 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:30.832677 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa2a82e7-b9ab-4d31-8a5c-72f642de0c45" containerName="extract" Apr 22 17:38:30.878700 
ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:30.878663 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fr64l"] Apr 22 17:38:30.878851 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:30.878780 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fr64l" Apr 22 17:38:30.881570 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:30.881548 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 17:38:30.881714 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:30.881633 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 17:38:30.881766 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:30.881750 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-7qrx5\"" Apr 22 17:38:30.881826 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:30.881808 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 17:38:30.976660 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:30.976627 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/bdb190ab-cc35-49ed-a6e2-5f7c9d11d81b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-fr64l\" (UID: \"bdb190ab-cc35-49ed-a6e2-5f7c9d11d81b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fr64l" Apr 22 17:38:30.976834 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:30.976682 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glkv2\" (UniqueName: \"kubernetes.io/projected/bdb190ab-cc35-49ed-a6e2-5f7c9d11d81b-kube-api-access-glkv2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-fr64l\" (UID: \"bdb190ab-cc35-49ed-a6e2-5f7c9d11d81b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fr64l" Apr 22 17:38:31.077459 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:31.077393 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/bdb190ab-cc35-49ed-a6e2-5f7c9d11d81b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-fr64l\" (UID: \"bdb190ab-cc35-49ed-a6e2-5f7c9d11d81b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fr64l" Apr 22 17:38:31.077630 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:31.077478 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glkv2\" (UniqueName: \"kubernetes.io/projected/bdb190ab-cc35-49ed-a6e2-5f7c9d11d81b-kube-api-access-glkv2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-fr64l\" (UID: \"bdb190ab-cc35-49ed-a6e2-5f7c9d11d81b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fr64l" Apr 22 17:38:31.079900 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:31.079880 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/bdb190ab-cc35-49ed-a6e2-5f7c9d11d81b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-fr64l\" (UID: \"bdb190ab-cc35-49ed-a6e2-5f7c9d11d81b\") " 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fr64l" Apr 22 17:38:31.086909 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:31.086845 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-glkv2\" (UniqueName: \"kubernetes.io/projected/bdb190ab-cc35-49ed-a6e2-5f7c9d11d81b-kube-api-access-glkv2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-fr64l\" (UID: \"bdb190ab-cc35-49ed-a6e2-5f7c9d11d81b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fr64l" Apr 22 17:38:31.188820 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:31.188785 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fr64l" Apr 22 17:38:31.315440 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:31.315391 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fr64l"] Apr 22 17:38:31.318826 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:38:31.318795 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdb190ab_cc35_49ed_a6e2_5f7c9d11d81b.slice/crio-cc7115005097bdd68aa72e83183775b9762b5c48acac1afc7c47a20bec6c7bf2 WatchSource:0}: Error finding container cc7115005097bdd68aa72e83183775b9762b5c48acac1afc7c47a20bec6c7bf2: Status 404 returned error can't find the container with id cc7115005097bdd68aa72e83183775b9762b5c48acac1afc7c47a20bec6c7bf2 Apr 22 17:38:32.188843 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:32.188809 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fr64l" event={"ID":"bdb190ab-cc35-49ed-a6e2-5f7c9d11d81b","Type":"ContainerStarted","Data":"cc7115005097bdd68aa72e83183775b9762b5c48acac1afc7c47a20bec6c7bf2"} Apr 22 17:38:35.233617 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:35.233589 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 17:38:35.253306 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:35.234389 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 17:38:35.253306 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:35.241583 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 17:38:36.201265 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:36.201224 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fr64l" event={"ID":"bdb190ab-cc35-49ed-a6e2-5f7c9d11d81b","Type":"ContainerStarted","Data":"59faaaea3f754bc6b50d33972c32206aa230bb7b6d2efd8d63a906a4b2b25672"} Apr 22 17:38:36.201465 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:36.201350 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fr64l" Apr 22 17:38:36.241852 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:36.241802 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fr64l" podStartSLOduration=1.9180280189999999 podStartE2EDuration="6.241785957s" podCreationTimestamp="2026-04-22 17:38:30 +0000 UTC" firstStartedPulling="2026-04-22 
17:38:31.320603349 +0000 UTC m=+297.787865029" lastFinishedPulling="2026-04-22 17:38:35.644361283 +0000 UTC m=+302.111622967" observedRunningTime="2026-04-22 17:38:36.24101765 +0000 UTC m=+302.708279356" watchObservedRunningTime="2026-04-22 17:38:36.241785957 +0000 UTC m=+302.709047658" Apr 22 17:38:36.677181 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:36.677143 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6"] Apr 22 17:38:36.680649 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:36.680626 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6" Apr 22 17:38:36.684626 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:36.684598 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 22 17:38:36.684767 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:36.684654 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 22 17:38:36.684767 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:36.684608 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-f77fc\"" Apr 22 17:38:36.696878 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:36.696845 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6"] Apr 22 17:38:36.720410 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:36.720378 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xhjp\" (UniqueName: \"kubernetes.io/projected/030bcd7a-3c12-4b91-a924-e8d994a982cc-kube-api-access-8xhjp\") pod \"keda-metrics-apiserver-7c9f485588-fj6b6\" (UID: \"030bcd7a-3c12-4b91-a924-e8d994a982cc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6" Apr 22 17:38:36.720613 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:36.720451 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/030bcd7a-3c12-4b91-a924-e8d994a982cc-certificates\") pod \"keda-metrics-apiserver-7c9f485588-fj6b6\" (UID: \"030bcd7a-3c12-4b91-a924-e8d994a982cc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6" Apr 22 17:38:36.720613 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:36.720498 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/030bcd7a-3c12-4b91-a924-e8d994a982cc-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-fj6b6\" (UID: \"030bcd7a-3c12-4b91-a924-e8d994a982cc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6" Apr 22 17:38:36.821139 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:36.821101 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xhjp\" (UniqueName: \"kubernetes.io/projected/030bcd7a-3c12-4b91-a924-e8d994a982cc-kube-api-access-8xhjp\") pod \"keda-metrics-apiserver-7c9f485588-fj6b6\" (UID: \"030bcd7a-3c12-4b91-a924-e8d994a982cc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6" Apr 22 17:38:36.821327 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:36.821156 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/030bcd7a-3c12-4b91-a924-e8d994a982cc-certificates\") pod \"keda-metrics-apiserver-7c9f485588-fj6b6\" (UID: \"030bcd7a-3c12-4b91-a924-e8d994a982cc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6" Apr 22 17:38:36.821327 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:36.821180 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/030bcd7a-3c12-4b91-a924-e8d994a982cc-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-fj6b6\" (UID: \"030bcd7a-3c12-4b91-a924-e8d994a982cc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6" Apr 22 17:38:36.821415 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:38:36.821322 2572 secret.go:281] references non-existent secret key: tls.crt Apr 22 17:38:36.821415 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:38:36.821346 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 17:38:36.821415 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:38:36.821363 2572 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 22 17:38:36.821415 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:38:36.821388 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 22 17:38:36.821588 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:38:36.821504 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/030bcd7a-3c12-4b91-a924-e8d994a982cc-certificates podName:030bcd7a-3c12-4b91-a924-e8d994a982cc nodeName:}" failed. No retries permitted until 2026-04-22 17:38:37.321479805 +0000 UTC m=+303.788741485 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/030bcd7a-3c12-4b91-a924-e8d994a982cc-certificates") pod "keda-metrics-apiserver-7c9f485588-fj6b6" (UID: "030bcd7a-3c12-4b91-a924-e8d994a982cc") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 22 17:38:36.821588 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:36.821535 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/030bcd7a-3c12-4b91-a924-e8d994a982cc-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-fj6b6\" (UID: \"030bcd7a-3c12-4b91-a924-e8d994a982cc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6" Apr 22 17:38:36.832468 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:36.832445 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xhjp\" (UniqueName: \"kubernetes.io/projected/030bcd7a-3c12-4b91-a924-e8d994a982cc-kube-api-access-8xhjp\") pod \"keda-metrics-apiserver-7c9f485588-fj6b6\" (UID: \"030bcd7a-3c12-4b91-a924-e8d994a982cc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6" Apr 22 17:38:37.326559 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:37.326512 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/030bcd7a-3c12-4b91-a924-e8d994a982cc-certificates\") pod \"keda-metrics-apiserver-7c9f485588-fj6b6\" (UID: \"030bcd7a-3c12-4b91-a924-e8d994a982cc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6" Apr 22 17:38:37.326971 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:38:37.326648 2572 secret.go:281] references non-existent secret key: tls.crt Apr 22 17:38:37.326971 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:38:37.326670 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 17:38:37.326971 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:38:37.326689 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6: references non-existent secret key: tls.crt Apr 22 17:38:37.326971 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:38:37.326750 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/030bcd7a-3c12-4b91-a924-e8d994a982cc-certificates podName:030bcd7a-3c12-4b91-a924-e8d994a982cc nodeName:}" failed. No retries permitted until 2026-04-22 17:38:38.326736934 +0000 UTC m=+304.793998613 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/030bcd7a-3c12-4b91-a924-e8d994a982cc-certificates") pod "keda-metrics-apiserver-7c9f485588-fj6b6" (UID: "030bcd7a-3c12-4b91-a924-e8d994a982cc") : references non-existent secret key: tls.crt Apr 22 17:38:38.334590 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:38.334553 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/030bcd7a-3c12-4b91-a924-e8d994a982cc-certificates\") pod \"keda-metrics-apiserver-7c9f485588-fj6b6\" (UID: \"030bcd7a-3c12-4b91-a924-e8d994a982cc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6" Apr 22 17:38:38.334954 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:38:38.334690 2572 secret.go:281] references non-existent secret key: tls.crt Apr 22 17:38:38.334954 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:38:38.334705 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 17:38:38.334954 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:38:38.334722 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6: references non-existent secret key: tls.crt Apr 22 17:38:38.334954 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:38:38.334783 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/030bcd7a-3c12-4b91-a924-e8d994a982cc-certificates podName:030bcd7a-3c12-4b91-a924-e8d994a982cc nodeName:}" failed. No retries permitted until 2026-04-22 17:38:40.334766833 +0000 UTC m=+306.802028516 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/030bcd7a-3c12-4b91-a924-e8d994a982cc-certificates") pod "keda-metrics-apiserver-7c9f485588-fj6b6" (UID: "030bcd7a-3c12-4b91-a924-e8d994a982cc") : references non-existent secret key: tls.crt Apr 22 17:38:40.348292 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:40.348253 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/030bcd7a-3c12-4b91-a924-e8d994a982cc-certificates\") pod \"keda-metrics-apiserver-7c9f485588-fj6b6\" (UID: \"030bcd7a-3c12-4b91-a924-e8d994a982cc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6" Apr 22 17:38:40.348806 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:38:40.348394 2572 secret.go:281] references non-existent secret key: tls.crt Apr 22 17:38:40.348806 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:38:40.348440 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 17:38:40.348806 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:38:40.348458 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6: references non-existent secret key: tls.crt Apr 22 17:38:40.348806 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:38:40.348509 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/030bcd7a-3c12-4b91-a924-e8d994a982cc-certificates podName:030bcd7a-3c12-4b91-a924-e8d994a982cc nodeName:}" failed. No retries permitted until 2026-04-22 17:38:44.348495645 +0000 UTC m=+310.815757324 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/030bcd7a-3c12-4b91-a924-e8d994a982cc-certificates") pod "keda-metrics-apiserver-7c9f485588-fj6b6" (UID: "030bcd7a-3c12-4b91-a924-e8d994a982cc") : references non-existent secret key: tls.crt Apr 22 17:38:44.380596 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:44.380548 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/030bcd7a-3c12-4b91-a924-e8d994a982cc-certificates\") pod \"keda-metrics-apiserver-7c9f485588-fj6b6\" (UID: \"030bcd7a-3c12-4b91-a924-e8d994a982cc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6" Apr 22 17:38:44.383173 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:44.383143 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/030bcd7a-3c12-4b91-a924-e8d994a982cc-certificates\") pod \"keda-metrics-apiserver-7c9f485588-fj6b6\" (UID: \"030bcd7a-3c12-4b91-a924-e8d994a982cc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6" Apr 22 17:38:44.499490 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:44.499451 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6" Apr 22 17:38:44.618677 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:44.618646 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6"] Apr 22 17:38:44.620725 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:38:44.620686 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod030bcd7a_3c12_4b91_a924_e8d994a982cc.slice/crio-0afe9c132e33f95d45bbba5b8ee8fe5750f7e9b3ea5823fd8a40b6b30da074dc WatchSource:0}: Error finding container 0afe9c132e33f95d45bbba5b8ee8fe5750f7e9b3ea5823fd8a40b6b30da074dc: Status 404 returned error can't find the container with id 0afe9c132e33f95d45bbba5b8ee8fe5750f7e9b3ea5823fd8a40b6b30da074dc Apr 22 17:38:44.621993 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:44.621973 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 17:38:45.231175 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:45.231131 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6" event={"ID":"030bcd7a-3c12-4b91-a924-e8d994a982cc","Type":"ContainerStarted","Data":"0afe9c132e33f95d45bbba5b8ee8fe5750f7e9b3ea5823fd8a40b6b30da074dc"} Apr 22 17:38:48.243100 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:48.243059 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6" event={"ID":"030bcd7a-3c12-4b91-a924-e8d994a982cc","Type":"ContainerStarted","Data":"59955e5f1c74f730032facb551c4c1c6816bedc41884892fb090374dcafebea8"} Apr 22 17:38:48.243507 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:48.243122 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6" Apr 22 17:38:48.259019 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:48.258957 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6" podStartSLOduration=9.3129102 podStartE2EDuration="12.258937653s" podCreationTimestamp="2026-04-22 17:38:36 +0000 UTC" 
firstStartedPulling="2026-04-22 17:38:44.62211252 +0000 UTC m=+311.089374199" lastFinishedPulling="2026-04-22 17:38:47.56813997 +0000 UTC m=+314.035401652" observedRunningTime="2026-04-22 17:38:48.25800849 +0000 UTC m=+314.725270191" watchObservedRunningTime="2026-04-22 17:38:48.258937653 +0000 UTC m=+314.726199355" Apr 22 17:38:57.207344 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:57.207312 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fr64l" Apr 22 17:38:59.252524 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:38:59.252493 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fj6b6" Apr 22 17:39:43.291004 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.290913 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-g4bz7"] Apr 22 17:39:43.294316 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.294293 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84ffddfb66-g4bz7" Apr 22 17:39:43.296931 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.296900 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 17:39:43.297911 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.297890 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 22 17:39:43.298007 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.297908 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 17:39:43.298007 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.297890 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-flxsm\"" Apr 22 17:39:43.302267 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.302242 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-g4bz7"] Apr 22 17:39:43.323398 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.323369 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-7hstq"] Apr 22 17:39:43.326676 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.326655 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-7hstq" Apr 22 17:39:43.328912 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.328890 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 17:39:43.329029 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.328945 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-9x9jd\"" Apr 22 17:39:43.335524 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.335501 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-7hstq"] Apr 22 17:39:43.440131 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.440097 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d5d35495-0d87-4b0e-9315-49e44437afd4-data\") pod \"seaweedfs-86cc847c5c-7hstq\" (UID: \"d5d35495-0d87-4b0e-9315-49e44437afd4\") " pod="kserve/seaweedfs-86cc847c5c-7hstq" Apr 22 17:39:43.440317 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.440141 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ea70342-b27f-45c9-9ae6-844509bd7bd9-cert\") pod \"kserve-controller-manager-84ffddfb66-g4bz7\" (UID: \"3ea70342-b27f-45c9-9ae6-844509bd7bd9\") " pod="kserve/kserve-controller-manager-84ffddfb66-g4bz7" Apr 22 17:39:43.440317 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.440168 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4tj7\" (UniqueName: \"kubernetes.io/projected/3ea70342-b27f-45c9-9ae6-844509bd7bd9-kube-api-access-c4tj7\") pod \"kserve-controller-manager-84ffddfb66-g4bz7\" (UID: \"3ea70342-b27f-45c9-9ae6-844509bd7bd9\") " pod="kserve/kserve-controller-manager-84ffddfb66-g4bz7" Apr 22 17:39:43.440317 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.440239 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnzt4\" (UniqueName: \"kubernetes.io/projected/d5d35495-0d87-4b0e-9315-49e44437afd4-kube-api-access-tnzt4\") pod \"seaweedfs-86cc847c5c-7hstq\" (UID: \"d5d35495-0d87-4b0e-9315-49e44437afd4\") " pod="kserve/seaweedfs-86cc847c5c-7hstq" Apr 22 17:39:43.541565 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.541481 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnzt4\" (UniqueName: \"kubernetes.io/projected/d5d35495-0d87-4b0e-9315-49e44437afd4-kube-api-access-tnzt4\") pod \"seaweedfs-86cc847c5c-7hstq\" (UID: \"d5d35495-0d87-4b0e-9315-49e44437afd4\") " pod="kserve/seaweedfs-86cc847c5c-7hstq" Apr 22 17:39:43.541741 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.541570 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d5d35495-0d87-4b0e-9315-49e44437afd4-data\") pod \"seaweedfs-86cc847c5c-7hstq\" (UID: \"d5d35495-0d87-4b0e-9315-49e44437afd4\") " pod="kserve/seaweedfs-86cc847c5c-7hstq" Apr 22 17:39:43.541741 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.541609 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ea70342-b27f-45c9-9ae6-844509bd7bd9-cert\") pod \"kserve-controller-manager-84ffddfb66-g4bz7\" (UID: \"3ea70342-b27f-45c9-9ae6-844509bd7bd9\") " 
pod="kserve/kserve-controller-manager-84ffddfb66-g4bz7" Apr 22 17:39:43.541741 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.541633 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4tj7\" (UniqueName: \"kubernetes.io/projected/3ea70342-b27f-45c9-9ae6-844509bd7bd9-kube-api-access-c4tj7\") pod \"kserve-controller-manager-84ffddfb66-g4bz7\" (UID: \"3ea70342-b27f-45c9-9ae6-844509bd7bd9\") " pod="kserve/kserve-controller-manager-84ffddfb66-g4bz7" Apr 22 17:39:43.542008 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.541973 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d5d35495-0d87-4b0e-9315-49e44437afd4-data\") pod \"seaweedfs-86cc847c5c-7hstq\" (UID: \"d5d35495-0d87-4b0e-9315-49e44437afd4\") " pod="kserve/seaweedfs-86cc847c5c-7hstq" Apr 22 17:39:43.544210 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.544187 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ea70342-b27f-45c9-9ae6-844509bd7bd9-cert\") pod \"kserve-controller-manager-84ffddfb66-g4bz7\" (UID: \"3ea70342-b27f-45c9-9ae6-844509bd7bd9\") " pod="kserve/kserve-controller-manager-84ffddfb66-g4bz7" Apr 22 17:39:43.562216 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.562188 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4tj7\" (UniqueName: \"kubernetes.io/projected/3ea70342-b27f-45c9-9ae6-844509bd7bd9-kube-api-access-c4tj7\") pod \"kserve-controller-manager-84ffddfb66-g4bz7\" (UID: \"3ea70342-b27f-45c9-9ae6-844509bd7bd9\") " pod="kserve/kserve-controller-manager-84ffddfb66-g4bz7" Apr 22 17:39:43.565153 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.565129 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnzt4\" (UniqueName: \"kubernetes.io/projected/d5d35495-0d87-4b0e-9315-49e44437afd4-kube-api-access-tnzt4\") pod \"seaweedfs-86cc847c5c-7hstq\" (UID: \"d5d35495-0d87-4b0e-9315-49e44437afd4\") " pod="kserve/seaweedfs-86cc847c5c-7hstq" Apr 22 17:39:43.607185 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.607148 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84ffddfb66-g4bz7" Apr 22 17:39:43.637229 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.637190 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-7hstq" Apr 22 17:39:43.752626 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.752505 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-g4bz7"] Apr 22 17:39:43.755397 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:39:43.755368 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ea70342_b27f_45c9_9ae6_844509bd7bd9.slice/crio-1f14fcee4ca01db307301e6414682d5d4f100b6391af3808d9a8e94f10590ba5 WatchSource:0}: Error finding container 1f14fcee4ca01db307301e6414682d5d4f100b6391af3808d9a8e94f10590ba5: Status 404 returned error can't find the container with id 1f14fcee4ca01db307301e6414682d5d4f100b6391af3808d9a8e94f10590ba5 Apr 22 17:39:43.777610 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:43.777587 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-7hstq"] Apr 22 17:39:43.779114 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:39:43.779082 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5d35495_0d87_4b0e_9315_49e44437afd4.slice/crio-b6cb75a79e3db657d62acaec2e0ee74a912eacc5b80e37abf9df0438d7910337 WatchSource:0}: Error finding container b6cb75a79e3db657d62acaec2e0ee74a912eacc5b80e37abf9df0438d7910337: Status 404 returned error can't find the container with id b6cb75a79e3db657d62acaec2e0ee74a912eacc5b80e37abf9df0438d7910337 Apr 22 17:39:44.419318 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:44.419272 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-7hstq" event={"ID":"d5d35495-0d87-4b0e-9315-49e44437afd4","Type":"ContainerStarted","Data":"b6cb75a79e3db657d62acaec2e0ee74a912eacc5b80e37abf9df0438d7910337"} Apr 22 17:39:44.420339 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:44.420311 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84ffddfb66-g4bz7" event={"ID":"3ea70342-b27f-45c9-9ae6-844509bd7bd9","Type":"ContainerStarted","Data":"1f14fcee4ca01db307301e6414682d5d4f100b6391af3808d9a8e94f10590ba5"} Apr 22 17:39:48.437372 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:48.437336 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-7hstq" event={"ID":"d5d35495-0d87-4b0e-9315-49e44437afd4","Type":"ContainerStarted","Data":"b8534b1d72a3ce4edd9a8e53162c626db861cf54f92a10a6c9d7f597285758f9"} Apr 22 17:39:48.437832 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:48.437452 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-7hstq" Apr 22 17:39:48.438701 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:48.438674 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84ffddfb66-g4bz7" event={"ID":"3ea70342-b27f-45c9-9ae6-844509bd7bd9","Type":"ContainerStarted","Data":"54cfcb1bf3bae951fc1e6a3779a8b80fa4a6856354920944b3ff46648ac3bd70"} Apr 22 17:39:48.438816 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:48.438743 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84ffddfb66-g4bz7" Apr 22 17:39:48.454770 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:48.454726 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-7hstq" podStartSLOduration=1.586374374 
podStartE2EDuration="5.454713198s" podCreationTimestamp="2026-04-22 17:39:43 +0000 UTC" firstStartedPulling="2026-04-22 17:39:43.780348727 +0000 UTC m=+370.247610407" lastFinishedPulling="2026-04-22 17:39:47.648687541 +0000 UTC m=+374.115949231" observedRunningTime="2026-04-22 17:39:48.453356051 +0000 UTC m=+374.920617774" watchObservedRunningTime="2026-04-22 17:39:48.454713198 +0000 UTC m=+374.921974899" Apr 22 17:39:48.469954 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:48.469893 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-84ffddfb66-g4bz7" podStartSLOduration=1.633420971 podStartE2EDuration="5.469877574s" podCreationTimestamp="2026-04-22 17:39:43 +0000 UTC" firstStartedPulling="2026-04-22 17:39:43.756636614 +0000 UTC m=+370.223898298" lastFinishedPulling="2026-04-22 17:39:47.593093222 +0000 UTC m=+374.060354901" observedRunningTime="2026-04-22 17:39:48.469031054 +0000 UTC m=+374.936292750" watchObservedRunningTime="2026-04-22 17:39:48.469877574 +0000 UTC m=+374.937139277" Apr 22 17:39:54.444451 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:39:54.444401 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-7hstq" Apr 22 17:40:18.838083 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:18.838046 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-g4bz7"] Apr 22 17:40:18.838616 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:18.838274 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-84ffddfb66-g4bz7" podUID="3ea70342-b27f-45c9-9ae6-844509bd7bd9" containerName="manager" containerID="cri-o://54cfcb1bf3bae951fc1e6a3779a8b80fa4a6856354920944b3ff46648ac3bd70" gracePeriod=10 Apr 22 17:40:18.843371 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:18.843338 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84ffddfb66-g4bz7" Apr 22 17:40:18.860035 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:18.860007 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-787ll"] Apr 22 17:40:18.863388 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:18.863367 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84ffddfb66-787ll" Apr 22 17:40:18.873741 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:18.873716 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-787ll"] Apr 22 17:40:18.930063 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:18.930027 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg24t\" (UniqueName: \"kubernetes.io/projected/41a25f1a-672b-44a3-be6c-dd53a77e74e2-kube-api-access-bg24t\") pod \"kserve-controller-manager-84ffddfb66-787ll\" (UID: \"41a25f1a-672b-44a3-be6c-dd53a77e74e2\") " pod="kserve/kserve-controller-manager-84ffddfb66-787ll" Apr 22 17:40:18.930219 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:18.930089 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41a25f1a-672b-44a3-be6c-dd53a77e74e2-cert\") pod \"kserve-controller-manager-84ffddfb66-787ll\" (UID: \"41a25f1a-672b-44a3-be6c-dd53a77e74e2\") " pod="kserve/kserve-controller-manager-84ffddfb66-787ll" Apr 22 17:40:19.031132 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.031099 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41a25f1a-672b-44a3-be6c-dd53a77e74e2-cert\") pod \"kserve-controller-manager-84ffddfb66-787ll\" (UID: \"41a25f1a-672b-44a3-be6c-dd53a77e74e2\") " pod="kserve/kserve-controller-manager-84ffddfb66-787ll" Apr 22 17:40:19.031289 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.031170 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bg24t\" (UniqueName: \"kubernetes.io/projected/41a25f1a-672b-44a3-be6c-dd53a77e74e2-kube-api-access-bg24t\") pod \"kserve-controller-manager-84ffddfb66-787ll\" (UID: \"41a25f1a-672b-44a3-be6c-dd53a77e74e2\") " pod="kserve/kserve-controller-manager-84ffddfb66-787ll" Apr 22 17:40:19.033741 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.033711 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41a25f1a-672b-44a3-be6c-dd53a77e74e2-cert\") pod \"kserve-controller-manager-84ffddfb66-787ll\" (UID: \"41a25f1a-672b-44a3-be6c-dd53a77e74e2\") " pod="kserve/kserve-controller-manager-84ffddfb66-787ll" Apr 22 17:40:19.039989 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.039954 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg24t\" (UniqueName: \"kubernetes.io/projected/41a25f1a-672b-44a3-be6c-dd53a77e74e2-kube-api-access-bg24t\") pod \"kserve-controller-manager-84ffddfb66-787ll\" (UID: \"41a25f1a-672b-44a3-be6c-dd53a77e74e2\") " pod="kserve/kserve-controller-manager-84ffddfb66-787ll" Apr 22 17:40:19.072956 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.072926 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84ffddfb66-g4bz7" Apr 22 17:40:19.132232 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.132140 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4tj7\" (UniqueName: \"kubernetes.io/projected/3ea70342-b27f-45c9-9ae6-844509bd7bd9-kube-api-access-c4tj7\") pod \"3ea70342-b27f-45c9-9ae6-844509bd7bd9\" (UID: \"3ea70342-b27f-45c9-9ae6-844509bd7bd9\") " Apr 22 17:40:19.132232 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.132211 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ea70342-b27f-45c9-9ae6-844509bd7bd9-cert\") pod \"3ea70342-b27f-45c9-9ae6-844509bd7bd9\" (UID: \"3ea70342-b27f-45c9-9ae6-844509bd7bd9\") " Apr 22 17:40:19.134454 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.134401 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ea70342-b27f-45c9-9ae6-844509bd7bd9-kube-api-access-c4tj7" (OuterVolumeSpecName: "kube-api-access-c4tj7") pod "3ea70342-b27f-45c9-9ae6-844509bd7bd9" (UID: "3ea70342-b27f-45c9-9ae6-844509bd7bd9"). InnerVolumeSpecName "kube-api-access-c4tj7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:40:19.134454 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.134407 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ea70342-b27f-45c9-9ae6-844509bd7bd9-cert" (OuterVolumeSpecName: "cert") pod "3ea70342-b27f-45c9-9ae6-844509bd7bd9" (UID: "3ea70342-b27f-45c9-9ae6-844509bd7bd9"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:40:19.219765 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.219731 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84ffddfb66-787ll" Apr 22 17:40:19.233681 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.233651 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c4tj7\" (UniqueName: \"kubernetes.io/projected/3ea70342-b27f-45c9-9ae6-844509bd7bd9-kube-api-access-c4tj7\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:40:19.233681 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.233681 2572 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ea70342-b27f-45c9-9ae6-844509bd7bd9-cert\") on node \"ip-10-0-131-22.ec2.internal\" DevicePath \"\"" Apr 22 17:40:19.347777 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.347745 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-787ll"] Apr 22 17:40:19.350878 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:40:19.350853 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41a25f1a_672b_44a3_be6c_dd53a77e74e2.slice/crio-caf31044d7d42d2dab2bc983160bbc2d9b2ce542123aaba53bea23727e2a2e46 WatchSource:0}: Error finding container caf31044d7d42d2dab2bc983160bbc2d9b2ce542123aaba53bea23727e2a2e46: Status 404 returned error can't find the container with id caf31044d7d42d2dab2bc983160bbc2d9b2ce542123aaba53bea23727e2a2e46 Apr 22 17:40:19.543096 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.543058 2572 generic.go:358] "Generic (PLEG): container finished" podID="3ea70342-b27f-45c9-9ae6-844509bd7bd9" containerID="54cfcb1bf3bae951fc1e6a3779a8b80fa4a6856354920944b3ff46648ac3bd70" exitCode=0 Apr 22 17:40:19.543277 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.543127 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84ffddfb66-g4bz7" Apr 22 17:40:19.543277 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.543146 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84ffddfb66-g4bz7" event={"ID":"3ea70342-b27f-45c9-9ae6-844509bd7bd9","Type":"ContainerDied","Data":"54cfcb1bf3bae951fc1e6a3779a8b80fa4a6856354920944b3ff46648ac3bd70"} Apr 22 17:40:19.543277 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.543195 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84ffddfb66-g4bz7" event={"ID":"3ea70342-b27f-45c9-9ae6-844509bd7bd9","Type":"ContainerDied","Data":"1f14fcee4ca01db307301e6414682d5d4f100b6391af3808d9a8e94f10590ba5"} Apr 22 17:40:19.543277 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.543212 2572 scope.go:117] "RemoveContainer" containerID="54cfcb1bf3bae951fc1e6a3779a8b80fa4a6856354920944b3ff46648ac3bd70" Apr 22 17:40:19.544277 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.544257 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84ffddfb66-787ll" event={"ID":"41a25f1a-672b-44a3-be6c-dd53a77e74e2","Type":"ContainerStarted","Data":"caf31044d7d42d2dab2bc983160bbc2d9b2ce542123aaba53bea23727e2a2e46"} Apr 22 17:40:19.552167 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.552149 2572 scope.go:117] "RemoveContainer" containerID="54cfcb1bf3bae951fc1e6a3779a8b80fa4a6856354920944b3ff46648ac3bd70" Apr 22 17:40:19.552454 ip-10-0-131-22 kubenswrapper[2572]: E0422 17:40:19.552413 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54cfcb1bf3bae951fc1e6a3779a8b80fa4a6856354920944b3ff46648ac3bd70\": container with ID starting with 54cfcb1bf3bae951fc1e6a3779a8b80fa4a6856354920944b3ff46648ac3bd70 not found: ID does not exist" containerID="54cfcb1bf3bae951fc1e6a3779a8b80fa4a6856354920944b3ff46648ac3bd70" Apr 22 17:40:19.552528 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.552461 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54cfcb1bf3bae951fc1e6a3779a8b80fa4a6856354920944b3ff46648ac3bd70"} err="failed to get container status \"54cfcb1bf3bae951fc1e6a3779a8b80fa4a6856354920944b3ff46648ac3bd70\": rpc error: code = NotFound desc = could not find container \"54cfcb1bf3bae951fc1e6a3779a8b80fa4a6856354920944b3ff46648ac3bd70\": container with ID starting with 54cfcb1bf3bae951fc1e6a3779a8b80fa4a6856354920944b3ff46648ac3bd70 not found: ID does not exist" Apr 22 17:40:19.563406 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.563377 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-g4bz7"] Apr 22 17:40:19.569179 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:19.569155 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-g4bz7"] Apr 22 17:40:20.128134 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:20.128060 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ea70342-b27f-45c9-9ae6-844509bd7bd9" path="/var/lib/kubelet/pods/3ea70342-b27f-45c9-9ae6-844509bd7bd9/volumes" Apr 22 17:40:20.549128 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:20.549083 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84ffddfb66-787ll" 
event={"ID":"41a25f1a-672b-44a3-be6c-dd53a77e74e2","Type":"ContainerStarted","Data":"3186e19813090f7b2fe2ca964e7bd04642d8abae1830b698a37e0500dd821fa4"} Apr 22 17:40:20.549302 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:20.549213 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84ffddfb66-787ll" Apr 22 17:40:20.565628 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:20.565578 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-84ffddfb66-787ll" podStartSLOduration=2.258484314 podStartE2EDuration="2.565565128s" podCreationTimestamp="2026-04-22 17:40:18 +0000 UTC" firstStartedPulling="2026-04-22 17:40:19.352084102 +0000 UTC m=+405.819345782" lastFinishedPulling="2026-04-22 17:40:19.659164911 +0000 UTC m=+406.126426596" observedRunningTime="2026-04-22 17:40:20.563685478 +0000 UTC m=+407.030947179" watchObservedRunningTime="2026-04-22 17:40:20.565565128 +0000 UTC m=+407.032826829" Apr 22 17:40:51.557273 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:51.557242 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84ffddfb66-787ll" Apr 22 17:40:52.459830 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.459791 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-9zmvc"] Apr 22 17:40:52.460118 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.460106 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ea70342-b27f-45c9-9ae6-844509bd7bd9" containerName="manager" Apr 22 17:40:52.460167 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.460120 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ea70342-b27f-45c9-9ae6-844509bd7bd9" containerName="manager" Apr 22 17:40:52.460225 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.460214 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ea70342-b27f-45c9-9ae6-844509bd7bd9" containerName="manager" Apr 22 17:40:52.463372 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.463343 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-9zmvc" Apr 22 17:40:52.465846 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.465821 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 22 17:40:52.465980 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.465926 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-k4lqx\"" Apr 22 17:40:52.476448 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.476405 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-9zmvc"] Apr 22 17:40:52.479044 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.479023 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-8pgqq"] Apr 22 17:40:52.482366 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.482346 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-8pgqq" Apr 22 17:40:52.484966 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.484939 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 22 17:40:52.485143 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.485121 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-rjgvq\"" Apr 22 17:40:52.491011 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.490981 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-8pgqq"] Apr 22 17:40:52.610593 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.610553 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5289715a-8c78-44e8-b650-fed6147a6394-cert\") pod \"odh-model-controller-696fc77849-8pgqq\" (UID: \"5289715a-8c78-44e8-b650-fed6147a6394\") " pod="kserve/odh-model-controller-696fc77849-8pgqq" Apr 22 17:40:52.611067 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.610622 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c5b0c19b-3e0f-4edd-b6f5-cdeee889f8a4-tls-certs\") pod \"model-serving-api-86f7b4b499-9zmvc\" (UID: \"c5b0c19b-3e0f-4edd-b6f5-cdeee889f8a4\") " pod="kserve/model-serving-api-86f7b4b499-9zmvc" Apr 22 17:40:52.611067 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.610676 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4zps\" (UniqueName: \"kubernetes.io/projected/c5b0c19b-3e0f-4edd-b6f5-cdeee889f8a4-kube-api-access-q4zps\") pod \"model-serving-api-86f7b4b499-9zmvc\" (UID: \"c5b0c19b-3e0f-4edd-b6f5-cdeee889f8a4\") " pod="kserve/model-serving-api-86f7b4b499-9zmvc" Apr 22 17:40:52.611067 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.610700 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljpmv\" (UniqueName: \"kubernetes.io/projected/5289715a-8c78-44e8-b650-fed6147a6394-kube-api-access-ljpmv\") pod \"odh-model-controller-696fc77849-8pgqq\" (UID: \"5289715a-8c78-44e8-b650-fed6147a6394\") " pod="kserve/odh-model-controller-696fc77849-8pgqq" Apr 22 17:40:52.711383 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.711304 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5289715a-8c78-44e8-b650-fed6147a6394-cert\") pod \"odh-model-controller-696fc77849-8pgqq\" (UID: \"5289715a-8c78-44e8-b650-fed6147a6394\") " pod="kserve/odh-model-controller-696fc77849-8pgqq" Apr 22 17:40:52.711383 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.711355 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c5b0c19b-3e0f-4edd-b6f5-cdeee889f8a4-tls-certs\") pod \"model-serving-api-86f7b4b499-9zmvc\" (UID: \"c5b0c19b-3e0f-4edd-b6f5-cdeee889f8a4\") " pod="kserve/model-serving-api-86f7b4b499-9zmvc" Apr 22 17:40:52.711600 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.711414 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4zps\" (UniqueName: \"kubernetes.io/projected/c5b0c19b-3e0f-4edd-b6f5-cdeee889f8a4-kube-api-access-q4zps\") pod 
\"model-serving-api-86f7b4b499-9zmvc\" (UID: \"c5b0c19b-3e0f-4edd-b6f5-cdeee889f8a4\") " pod="kserve/model-serving-api-86f7b4b499-9zmvc" Apr 22 17:40:52.711600 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.711527 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljpmv\" (UniqueName: \"kubernetes.io/projected/5289715a-8c78-44e8-b650-fed6147a6394-kube-api-access-ljpmv\") pod \"odh-model-controller-696fc77849-8pgqq\" (UID: \"5289715a-8c78-44e8-b650-fed6147a6394\") " pod="kserve/odh-model-controller-696fc77849-8pgqq" Apr 22 17:40:52.714009 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.713982 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c5b0c19b-3e0f-4edd-b6f5-cdeee889f8a4-tls-certs\") pod \"model-serving-api-86f7b4b499-9zmvc\" (UID: \"c5b0c19b-3e0f-4edd-b6f5-cdeee889f8a4\") " pod="kserve/model-serving-api-86f7b4b499-9zmvc" Apr 22 17:40:52.714009 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.714004 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5289715a-8c78-44e8-b650-fed6147a6394-cert\") pod \"odh-model-controller-696fc77849-8pgqq\" (UID: \"5289715a-8c78-44e8-b650-fed6147a6394\") " pod="kserve/odh-model-controller-696fc77849-8pgqq" Apr 22 17:40:52.720001 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.719980 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljpmv\" (UniqueName: \"kubernetes.io/projected/5289715a-8c78-44e8-b650-fed6147a6394-kube-api-access-ljpmv\") pod \"odh-model-controller-696fc77849-8pgqq\" (UID: \"5289715a-8c78-44e8-b650-fed6147a6394\") " pod="kserve/odh-model-controller-696fc77849-8pgqq" Apr 22 17:40:52.720143 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.720121 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4zps\" (UniqueName: \"kubernetes.io/projected/c5b0c19b-3e0f-4edd-b6f5-cdeee889f8a4-kube-api-access-q4zps\") pod \"model-serving-api-86f7b4b499-9zmvc\" (UID: \"c5b0c19b-3e0f-4edd-b6f5-cdeee889f8a4\") " pod="kserve/model-serving-api-86f7b4b499-9zmvc" Apr 22 17:40:52.774874 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.774838 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-9zmvc" Apr 22 17:40:52.795645 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.795615 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-8pgqq" Apr 22 17:40:52.940776 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.940743 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-9zmvc"] Apr 22 17:40:52.941905 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:40:52.941877 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5b0c19b_3e0f_4edd_b6f5_cdeee889f8a4.slice/crio-f000644e9245477929e55c567969b736f86aad97e43fb6cd9a2c0967174fe3f0 WatchSource:0}: Error finding container f000644e9245477929e55c567969b736f86aad97e43fb6cd9a2c0967174fe3f0: Status 404 returned error can't find the container with id f000644e9245477929e55c567969b736f86aad97e43fb6cd9a2c0967174fe3f0 Apr 22 17:40:52.963050 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:52.963028 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-8pgqq"] Apr 22 17:40:52.965020 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:40:52.964996 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5289715a_8c78_44e8_b650_fed6147a6394.slice/crio-6afcb192eb21710c78c6bf62429c479b5b5bcfff85434e6763b444c96b5e273c WatchSource:0}: Error finding container 6afcb192eb21710c78c6bf62429c479b5b5bcfff85434e6763b444c96b5e273c: Status 404 returned error can't find the container with id 6afcb192eb21710c78c6bf62429c479b5b5bcfff85434e6763b444c96b5e273c Apr 22 17:40:53.658818 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:53.658679 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-9zmvc" event={"ID":"c5b0c19b-3e0f-4edd-b6f5-cdeee889f8a4","Type":"ContainerStarted","Data":"f000644e9245477929e55c567969b736f86aad97e43fb6cd9a2c0967174fe3f0"} Apr 22 17:40:53.661230 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:53.661167 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-8pgqq" event={"ID":"5289715a-8c78-44e8-b650-fed6147a6394","Type":"ContainerStarted","Data":"6afcb192eb21710c78c6bf62429c479b5b5bcfff85434e6763b444c96b5e273c"} Apr 22 17:40:57.677885 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:57.677843 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-9zmvc" event={"ID":"c5b0c19b-3e0f-4edd-b6f5-cdeee889f8a4","Type":"ContainerStarted","Data":"dd46cb3e1df5c0234a9e4e3bc39531bd2981cefe696419568c012669652f2ec1"} Apr 22 17:40:57.678388 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:57.677918 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-9zmvc" Apr 22 17:40:57.679181 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:57.679160 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-8pgqq" event={"ID":"5289715a-8c78-44e8-b650-fed6147a6394","Type":"ContainerStarted","Data":"502ebe4c21a9b9a4e8ed5ca15ea8d15556e33b651d85853da930bd638db6b65c"} Apr 22 17:40:57.679281 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:57.679271 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-8pgqq" Apr 22 17:40:57.696879 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:57.696830 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-9zmvc" 
podStartSLOduration=1.93913937 podStartE2EDuration="5.696811534s" podCreationTimestamp="2026-04-22 17:40:52 +0000 UTC" firstStartedPulling="2026-04-22 17:40:52.943912495 +0000 UTC m=+439.411174175" lastFinishedPulling="2026-04-22 17:40:56.701584641 +0000 UTC m=+443.168846339" observedRunningTime="2026-04-22 17:40:57.694887021 +0000 UTC m=+444.162148722" watchObservedRunningTime="2026-04-22 17:40:57.696811534 +0000 UTC m=+444.164073236" Apr 22 17:40:57.711031 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:40:57.710975 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-8pgqq" podStartSLOduration=2.030491008 podStartE2EDuration="5.710957783s" podCreationTimestamp="2026-04-22 17:40:52 +0000 UTC" firstStartedPulling="2026-04-22 17:40:52.966294061 +0000 UTC m=+439.433555741" lastFinishedPulling="2026-04-22 17:40:56.646760833 +0000 UTC m=+443.114022516" observedRunningTime="2026-04-22 17:40:57.710157563 +0000 UTC m=+444.177419263" watchObservedRunningTime="2026-04-22 17:40:57.710957783 +0000 UTC m=+444.178219486" Apr 22 17:41:08.684781 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:08.684750 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-8pgqq" Apr 22 17:41:08.686734 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:08.686708 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-9zmvc" Apr 22 17:41:28.921660 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:28.921622 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd"] Apr 22 17:41:28.932238 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:28.932202 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" Apr 22 17:41:28.934832 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:28.934811 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-2gdmh\"" Apr 22 17:41:28.935591 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:28.935563 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd"] Apr 22 17:41:29.024286 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:29.024246 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ca5dbd9-5447-403a-81c9-b37ab6ec2393-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd\" (UID: \"9ca5dbd9-5447-403a-81c9-b37ab6ec2393\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" Apr 22 17:41:29.125047 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:29.125009 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ca5dbd9-5447-403a-81c9-b37ab6ec2393-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd\" (UID: \"9ca5dbd9-5447-403a-81c9-b37ab6ec2393\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" Apr 22 17:41:29.125375 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:29.125352 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ca5dbd9-5447-403a-81c9-b37ab6ec2393-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd\" (UID: \"9ca5dbd9-5447-403a-81c9-b37ab6ec2393\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" Apr 22 17:41:29.244717 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:29.244636 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" Apr 22 17:41:29.372236 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:29.372211 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd"] Apr 22 17:41:29.374850 ip-10-0-131-22 kubenswrapper[2572]: W0422 17:41:29.374818 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ca5dbd9_5447_403a_81c9_b37ab6ec2393.slice/crio-857bc6f608e2ed9b5a9d90bb50d831fe82af1b9be76dd43086644bd881c0b160 WatchSource:0}: Error finding container 857bc6f608e2ed9b5a9d90bb50d831fe82af1b9be76dd43086644bd881c0b160: Status 404 returned error can't find the container with id 857bc6f608e2ed9b5a9d90bb50d831fe82af1b9be76dd43086644bd881c0b160 Apr 22 17:41:29.783681 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:29.783641 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" event={"ID":"9ca5dbd9-5447-403a-81c9-b37ab6ec2393","Type":"ContainerStarted","Data":"857bc6f608e2ed9b5a9d90bb50d831fe82af1b9be76dd43086644bd881c0b160"} Apr 22 17:41:33.798541 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:33.798496 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" event={"ID":"9ca5dbd9-5447-403a-81c9-b37ab6ec2393","Type":"ContainerStarted","Data":"6ecb5e5088e008b20f1c821a7135f3d16da7ea385d674216531666df8bd37574"} Apr 22 17:41:34.095941 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:34.095910 2572 scope.go:117] "RemoveContainer" containerID="f5ebc50145fa5e011688de62dcd62f448d1e61b57fed0d431dc280b6b73c36e5" Apr 22 17:41:37.817256 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:37.817221 2572 generic.go:358] "Generic (PLEG): container finished" podID="9ca5dbd9-5447-403a-81c9-b37ab6ec2393" containerID="6ecb5e5088e008b20f1c821a7135f3d16da7ea385d674216531666df8bd37574" exitCode=0 Apr 22 17:41:37.817656 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:37.817302 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" event={"ID":"9ca5dbd9-5447-403a-81c9-b37ab6ec2393","Type":"ContainerDied","Data":"6ecb5e5088e008b20f1c821a7135f3d16da7ea385d674216531666df8bd37574"} Apr 22 17:41:51.877692 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:51.877606 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" event={"ID":"9ca5dbd9-5447-403a-81c9-b37ab6ec2393","Type":"ContainerStarted","Data":"5033fcbc05f6a3178329fde5b63e625a23e36b488b21916cea666d84891290be"} Apr 22 17:41:55.892616 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:55.892579 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" event={"ID":"9ca5dbd9-5447-403a-81c9-b37ab6ec2393","Type":"ContainerStarted","Data":"a54bc3bbc91b02ecad2ffa2c7979fe38ec866c442a5660ab7bfe1d106058baed"} Apr 22 17:41:55.893047 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:55.892866 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" Apr 22 17:41:55.894002 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:55.893973 2572 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" podUID="9ca5dbd9-5447-403a-81c9-b37ab6ec2393" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 22 17:41:55.911736 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:55.911689 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" podStartSLOduration=2.469162903 podStartE2EDuration="27.911675083s" podCreationTimestamp="2026-04-22 17:41:28 +0000 UTC" firstStartedPulling="2026-04-22 17:41:29.376607177 +0000 UTC m=+475.843868858" lastFinishedPulling="2026-04-22 17:41:54.819119354 +0000 UTC m=+501.286381038" observedRunningTime="2026-04-22 17:41:55.910040289 +0000 UTC m=+502.377302016" watchObservedRunningTime="2026-04-22 17:41:55.911675083 +0000 UTC m=+502.378936785" Apr 22 17:41:56.896156 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:56.896118 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" Apr 22 17:41:56.896640 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:56.896274 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" podUID="9ca5dbd9-5447-403a-81c9-b37ab6ec2393" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 22 17:41:56.897093 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:56.897067 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" podUID="9ca5dbd9-5447-403a-81c9-b37ab6ec2393" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 17:41:57.899756 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:57.899718 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" podUID="9ca5dbd9-5447-403a-81c9-b37ab6ec2393" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 22 17:41:57.900113 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:41:57.900017 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" podUID="9ca5dbd9-5447-403a-81c9-b37ab6ec2393" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 17:42:07.899816 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:42:07.899772 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" podUID="9ca5dbd9-5447-403a-81c9-b37ab6ec2393" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 22 17:42:07.900371 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:42:07.900251 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" podUID="9ca5dbd9-5447-403a-81c9-b37ab6ec2393" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 17:42:17.899891 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:42:17.899839 2572 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" podUID="9ca5dbd9-5447-403a-81c9-b37ab6ec2393" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 22 17:42:17.900336 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:42:17.900294 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" podUID="9ca5dbd9-5447-403a-81c9-b37ab6ec2393" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 17:42:27.900386 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:42:27.900285 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" podUID="9ca5dbd9-5447-403a-81c9-b37ab6ec2393" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 22 17:42:27.900887 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:42:27.900865 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" podUID="9ca5dbd9-5447-403a-81c9-b37ab6ec2393" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 17:42:37.900497 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:42:37.900451 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" podUID="9ca5dbd9-5447-403a-81c9-b37ab6ec2393" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 22 17:42:37.900987 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:42:37.900963 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" podUID="9ca5dbd9-5447-403a-81c9-b37ab6ec2393" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 17:42:47.900647 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:42:47.900596 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" podUID="9ca5dbd9-5447-403a-81c9-b37ab6ec2393" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 22 17:42:47.901126 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:42:47.901023 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" podUID="9ca5dbd9-5447-403a-81c9-b37ab6ec2393" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 17:42:57.900646 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:42:57.900602 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" podUID="9ca5dbd9-5447-403a-81c9-b37ab6ec2393" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 22 17:42:57.901105 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:42:57.901078 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" podUID="9ca5dbd9-5447-403a-81c9-b37ab6ec2393" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 17:43:07.900624 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:43:07.900582 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" Apr 22 17:43:07.901088 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:43:07.900656 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd" Apr 22 17:43:35.261020 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:43:35.260908 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 17:43:35.261020 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:43:35.260926 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 17:48:35.284258 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:48:35.284148 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 17:48:35.289948 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:48:35.285309 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 17:53:35.314712 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:53:35.314584 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 17:53:35.318718 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:53:35.316567 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 17:58:35.337506 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:58:35.337376 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 17:58:35.340662 ip-10-0-131-22 kubenswrapper[2572]: I0422 17:58:35.340032 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:03:35.361458 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:03:35.361331 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:03:35.366482 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:03:35.366455 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:08:35.384079 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:08:35.383970 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:08:35.389377 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:08:35.389356 
2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:13:35.406941 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:13:35.406908 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:13:35.413042 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:13:35.413014 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:18:35.429201 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:18:35.429087 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:18:35.436255 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:18:35.436230 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:23:35.456440 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:23:35.456305 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:23:35.465715 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:23:35.465691 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:28:35.483993 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:28:35.483831 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:28:35.492254 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:28:35.492232 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:33:35.507289 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:33:35.507183 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:33:35.516437 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:33:35.516406 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:38:35.529869 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:38:35.529757 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:38:35.540320 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:38:35.540298 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:41:31.734502 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:41:31.734406 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve_kserve-controller-manager-84ffddfb66-787ll_41a25f1a-672b-44a3-be6c-dd53a77e74e2/manager/0.log" Apr 22 18:43:35.552745 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:43:35.552629 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:43:35.564132 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:43:35.564108 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:44:32.347452 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:44:32.347357 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-84ffddfb66-787ll_41a25f1a-672b-44a3-be6c-dd53a77e74e2/manager/0.log" Apr 22 18:48:35.575988 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:48:35.575869 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:48:35.590897 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:48:35.590867 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:53:35.602655 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:53:35.602544 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:53:35.614295 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:53:35.614267 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:58:35.624975 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:58:35.624845 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 18:58:35.637969 ip-10-0-131-22 kubenswrapper[2572]: I0422 18:58:35.637942 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 19:03:35.647931 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:03:35.647900 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 19:03:35.661286 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:03:35.661265 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log" Apr 22 19:04:48.829675 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:48.829637 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:48.854134 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:48.854100 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:48.865846 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:48.865820 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:04:49.363281 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:49.363244 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:49.375030 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:49.375000 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:49.384589 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:49.384561 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:04:49.890211 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:49.890181 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:49.901547 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:49.901520 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:49.913572 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:49.913543 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:04:50.383768 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:50.383671 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:50.408492 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:50.408456 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:50.419396 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:50.419367 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:04:50.891250 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:50.891220 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:50.901999 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:50.901971 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:50.913928 ip-10-0-131-22 
kubenswrapper[2572]: I0422 19:04:50.913905 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:04:51.394470 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:51.394439 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:51.404253 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:51.404226 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:51.414009 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:51.413987 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:04:51.893606 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:51.893578 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:51.904319 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:51.904296 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:51.914178 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:51.914159 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:04:52.379555 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:52.379464 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:52.389472 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:52.389448 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:52.398664 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:52.398635 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:04:52.868718 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:52.868685 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:52.880199 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:52.880174 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:52.889778 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:52.889754 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:04:53.354035 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:53.354005 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:53.364197 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:53.364171 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:53.373969 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:53.373949 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:04:53.838103 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:53.838072 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:53.849648 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:53.849621 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:53.860483 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:53.860458 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:04:54.340553 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:54.340520 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:54.350464 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:54.350433 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:54.359928 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:54.359902 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:04:54.822993 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:54.822962 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:54.833867 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:54.833825 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:54.849413 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:54.849379 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:04:55.335775 
ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:55.335742 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:55.345791 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:55.345763 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:55.354875 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:55.354847 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:04:55.801983 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:55.801954 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:55.812031 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:55.812001 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:55.825197 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:55.825171 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:04:56.298511 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:56.298480 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:56.309945 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:56.309917 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:56.320308 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:56.320281 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:04:56.842671 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:56.842643 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:56.853958 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:56.853931 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:56.863202 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:56.863178 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:04:57.366041 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:57.366005 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:57.375862 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:57.375837 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:57.385252 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:57.385228 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:04:57.886703 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:57.886670 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:57.896482 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:57.896459 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:57.906204 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:57.906182 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:04:58.389827 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:58.389799 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:58.399879 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:58.399845 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:58.410331 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:58.410307 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:04:58.887513 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:58.887486 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/kserve-container/0.log" Apr 22 19:04:58.897336 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:58.897311 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/agent/0.log" Apr 22 19:04:58.906957 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:04:58.906932 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-1df94-predictor-685df7dbbc-g2ctd_9ca5dbd9-5447-403a-81c9-b37ab6ec2393/storage-initializer/0.log" Apr 22 19:05:03.505368 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:03.505340 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-tngvx_afc146fd-f74a-41e0-b236-6b55673d7657/global-pull-secret-syncer/0.log" Apr 22 19:05:03.617710 ip-10-0-131-22 
kubenswrapper[2572]: I0422 19:05:03.617677 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6756n_de2b18a3-8db8-472b-8406-2443f8f9c9b3/konnectivity-agent/0.log"
Apr 22 19:05:03.768831 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:03.768756 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-22.ec2.internal_43fb11857c40bae062d44a60922715d6/haproxy/0.log"
Apr 22 19:05:07.490470 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:07.490436 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-vpqhw_e9ca84a4-7331-4976-9476-b7842a9814e3/cluster-monitoring-operator/0.log"
Apr 22 19:05:07.653007 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:07.652975 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8b28h_04770277-1292-4d62-8d2c-a5c19b46b73a/node-exporter/0.log"
Apr 22 19:05:07.673388 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:07.673356 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8b28h_04770277-1292-4d62-8d2c-a5c19b46b73a/kube-rbac-proxy/0.log"
Apr 22 19:05:07.693317 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:07.693292 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8b28h_04770277-1292-4d62-8d2c-a5c19b46b73a/init-textfile/0.log"
Apr 22 19:05:09.538257 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:09.538224 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-pcpz9_59ff7442-70a1-4df1-a3a2-9eff7d027d6e/networking-console-plugin/0.log"
Apr 22 19:05:09.955038 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:09.955001 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/1.log"
Apr 22 19:05:09.963914 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:09.963885 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vvmj2_7bfe6b31-c032-43ba-be07-caa12af15041/console-operator/2.log"
Apr 22 19:05:10.372135 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.372052 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-sm7xt_4cb6c6fd-89a2-40ee-b3e2-d562a853e308/download-server/0.log"
Apr 22 19:05:10.775709 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.775677 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-v8lmw_8b7e852e-32bb-4bbe-be45-374b4376ee6d/volume-data-source-validator/0.log"
Apr 22 19:05:10.776097 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.775902 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"]
Apr 22 19:05:10.779200 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.779182 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"
Apr 22 19:05:10.781463 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.781445 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b8m7m\"/\"openshift-service-ca.crt\""
Apr 22 19:05:10.782400 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.782377 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-b8m7m\"/\"default-dockercfg-vckz5\""
Apr 22 19:05:10.782511 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.782411 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b8m7m\"/\"kube-root-ca.crt\""
Apr 22 19:05:10.787615 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.787596 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"]
Apr 22 19:05:10.789345 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.789330 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1fbdb498-be72-4742-83c3-3252a2313d7f-lib-modules\") pod \"perf-node-gather-daemonset-nkgrt\" (UID: \"1fbdb498-be72-4742-83c3-3252a2313d7f\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"
Apr 22 19:05:10.789438 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.789365 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1fbdb498-be72-4742-83c3-3252a2313d7f-proc\") pod \"perf-node-gather-daemonset-nkgrt\" (UID: \"1fbdb498-be72-4742-83c3-3252a2313d7f\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"
Apr 22 19:05:10.789503 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.789458 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1fbdb498-be72-4742-83c3-3252a2313d7f-sys\") pod \"perf-node-gather-daemonset-nkgrt\" (UID: \"1fbdb498-be72-4742-83c3-3252a2313d7f\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"
Apr 22 19:05:10.789556 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.789496 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbj5f\" (UniqueName: \"kubernetes.io/projected/1fbdb498-be72-4742-83c3-3252a2313d7f-kube-api-access-tbj5f\") pod \"perf-node-gather-daemonset-nkgrt\" (UID: \"1fbdb498-be72-4742-83c3-3252a2313d7f\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"
Apr 22 19:05:10.789596 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.789567 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1fbdb498-be72-4742-83c3-3252a2313d7f-podres\") pod \"perf-node-gather-daemonset-nkgrt\" (UID: \"1fbdb498-be72-4742-83c3-3252a2313d7f\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"
Apr 22 19:05:10.890769 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.890733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1fbdb498-be72-4742-83c3-3252a2313d7f-proc\") pod \"perf-node-gather-daemonset-nkgrt\" (UID: \"1fbdb498-be72-4742-83c3-3252a2313d7f\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"
Apr 22 19:05:10.890972 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.890775 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1fbdb498-be72-4742-83c3-3252a2313d7f-sys\") pod \"perf-node-gather-daemonset-nkgrt\" (UID: \"1fbdb498-be72-4742-83c3-3252a2313d7f\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"
Apr 22 19:05:10.890972 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.890802 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbj5f\" (UniqueName: \"kubernetes.io/projected/1fbdb498-be72-4742-83c3-3252a2313d7f-kube-api-access-tbj5f\") pod \"perf-node-gather-daemonset-nkgrt\" (UID: \"1fbdb498-be72-4742-83c3-3252a2313d7f\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"
Apr 22 19:05:10.890972 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.890837 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1fbdb498-be72-4742-83c3-3252a2313d7f-podres\") pod \"perf-node-gather-daemonset-nkgrt\" (UID: \"1fbdb498-be72-4742-83c3-3252a2313d7f\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"
Apr 22 19:05:10.890972 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.890846 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1fbdb498-be72-4742-83c3-3252a2313d7f-proc\") pod \"perf-node-gather-daemonset-nkgrt\" (UID: \"1fbdb498-be72-4742-83c3-3252a2313d7f\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"
Apr 22 19:05:10.890972 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.890896 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1fbdb498-be72-4742-83c3-3252a2313d7f-sys\") pod \"perf-node-gather-daemonset-nkgrt\" (UID: \"1fbdb498-be72-4742-83c3-3252a2313d7f\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"
Apr 22 19:05:10.890972 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.890911 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1fbdb498-be72-4742-83c3-3252a2313d7f-lib-modules\") pod \"perf-node-gather-daemonset-nkgrt\" (UID: \"1fbdb498-be72-4742-83c3-3252a2313d7f\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"
Apr 22 19:05:10.891253 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.890986 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1fbdb498-be72-4742-83c3-3252a2313d7f-podres\") pod \"perf-node-gather-daemonset-nkgrt\" (UID: \"1fbdb498-be72-4742-83c3-3252a2313d7f\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"
Apr 22 19:05:10.891253 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.891037 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1fbdb498-be72-4742-83c3-3252a2313d7f-lib-modules\") pod \"perf-node-gather-daemonset-nkgrt\" (UID: \"1fbdb498-be72-4742-83c3-3252a2313d7f\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"
Apr 22 19:05:10.898014 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:10.897992 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbj5f\" (UniqueName: \"kubernetes.io/projected/1fbdb498-be72-4742-83c3-3252a2313d7f-kube-api-access-tbj5f\") pod \"perf-node-gather-daemonset-nkgrt\" (UID: \"1fbdb498-be72-4742-83c3-3252a2313d7f\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"
Apr 22 19:05:11.091300 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:11.091215 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"
Apr 22 19:05:11.218089 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:11.218062 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"]
Apr 22 19:05:11.223993 ip-10-0-131-22 kubenswrapper[2572]: W0422 19:05:11.223960 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1fbdb498_be72_4742_83c3_3252a2313d7f.slice/crio-4493ce64840d3660417b626a6eaa6b6da375f7efe2d706eb0a2aff0375a68a23 WatchSource:0}: Error finding container 4493ce64840d3660417b626a6eaa6b6da375f7efe2d706eb0a2aff0375a68a23: Status 404 returned error can't find the container with id 4493ce64840d3660417b626a6eaa6b6da375f7efe2d706eb0a2aff0375a68a23
Apr 22 19:05:11.225885 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:11.225866 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:05:11.462541 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:11.462508 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt" event={"ID":"1fbdb498-be72-4742-83c3-3252a2313d7f","Type":"ContainerStarted","Data":"a5d212da087dff9348e7bd056ec4e5507d926b8ce0fa82a97ae30e9d892934f6"}
Apr 22 19:05:11.462541 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:11.462543 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt" event={"ID":"1fbdb498-be72-4742-83c3-3252a2313d7f","Type":"ContainerStarted","Data":"4493ce64840d3660417b626a6eaa6b6da375f7efe2d706eb0a2aff0375a68a23"}
Apr 22 19:05:11.462795 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:11.462674 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt"
Apr 22 19:05:11.478345 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:11.478300 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt" podStartSLOduration=1.478286018 podStartE2EDuration="1.478286018s" podCreationTimestamp="2026-04-22 19:05:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:05:11.476688558 +0000 UTC m=+5497.943950271" watchObservedRunningTime="2026-04-22 19:05:11.478286018 +0000 UTC m=+5497.945547720"
Apr 22 19:05:11.551394 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:11.551359 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vj4xl_b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a/dns/0.log"
Apr 22 19:05:11.572928 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:11.572889 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vj4xl_b9d86a41-acf5-4d73-ab7e-be8d3ed73a9a/kube-rbac-proxy/0.log"
Apr 22 19:05:11.638964 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:11.638925 2572 log.go:25] "Finished parsing log
file" path="/var/log/pods/openshift-dns_node-resolver-lxs9q_d3515122-d7cf-41fe-855d-d19ccfe73070/dns-node-resolver/0.log" Apr 22 19:05:12.094039 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:12.094003 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5lt7k_5ea00ae3-4a64-4435-be9b-6d9aec346440/node-ca/0.log" Apr 22 19:05:12.830896 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:12.830869 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-9d9fb4b58-dpchv_17af67ec-8577-45de-abbb-01a7199ee7cd/router/0.log" Apr 22 19:05:13.173709 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:13.173679 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-s28sr_dabb0188-2e22-40cb-b765-c6e0a5a0b030/serve-healthcheck-canary/0.log" Apr 22 19:05:13.540331 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:13.540234 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-7rk27_2dd100f6-0060-426d-9cf7-a7f9fafa003a/insights-operator/0.log" Apr 22 19:05:13.542826 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:13.542795 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-7rk27_2dd100f6-0060-426d-9cf7-a7f9fafa003a/insights-operator/1.log" Apr 22 19:05:13.565029 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:13.564988 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-22kpx_65f406bd-2301-4de9-9b94-1fb285e03d6e/kube-rbac-proxy/0.log" Apr 22 19:05:13.582855 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:13.582828 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-22kpx_65f406bd-2301-4de9-9b94-1fb285e03d6e/exporter/0.log" Apr 22 19:05:13.604852 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:13.604822 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-22kpx_65f406bd-2301-4de9-9b94-1fb285e03d6e/extractor/0.log" Apr 22 19:05:15.607906 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:15.607873 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-84ffddfb66-787ll_41a25f1a-672b-44a3-be6c-dd53a77e74e2/manager/0.log" Apr 22 19:05:15.647926 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:15.647899 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-9zmvc_c5b0c19b-3e0f-4edd-b6f5-cdeee889f8a4/server/0.log" Apr 22 19:05:15.739017 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:15.738982 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-8pgqq_5289715a-8c78-44e8-b650-fed6147a6394/manager/0.log" Apr 22 19:05:15.787302 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:15.787269 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-7hstq_d5d35495-0d87-4b0e-9315-49e44437afd4/seaweedfs/0.log" Apr 22 19:05:17.475978 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:17.475948 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-nkgrt" Apr 22 19:05:19.404744 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:19.404708 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-krtt6_ff80dfe5-183b-4c0c-90e5-3467e987deec/migrator/0.log" Apr 22 19:05:19.424872 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:19.424850 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-krtt6_ff80dfe5-183b-4c0c-90e5-3467e987deec/graceful-termination/0.log" Apr 22 19:05:19.764189 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:19.764155 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6bzf9_346331ec-1cea-46ea-8952-6af403c257c0/kube-storage-version-migrator-operator/1.log" Apr 22 19:05:19.765832 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:19.765809 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6bzf9_346331ec-1cea-46ea-8952-6af403c257c0/kube-storage-version-migrator-operator/0.log" Apr 22 19:05:21.055068 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:21.055034 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rdl42_000311a6-600b-4136-89c9-336cdc563106/kube-multus-additional-cni-plugins/0.log" Apr 22 19:05:21.073354 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:21.073326 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rdl42_000311a6-600b-4136-89c9-336cdc563106/egress-router-binary-copy/0.log" Apr 22 19:05:21.091358 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:21.091335 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rdl42_000311a6-600b-4136-89c9-336cdc563106/cni-plugins/0.log" Apr 22 19:05:21.110285 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:21.110259 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rdl42_000311a6-600b-4136-89c9-336cdc563106/bond-cni-plugin/0.log" Apr 22 19:05:21.143352 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:21.143323 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rdl42_000311a6-600b-4136-89c9-336cdc563106/routeoverride-cni/0.log" Apr 22 19:05:21.170868 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:21.170839 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rdl42_000311a6-600b-4136-89c9-336cdc563106/whereabouts-cni-bincopy/0.log" Apr 22 19:05:21.189004 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:21.188978 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rdl42_000311a6-600b-4136-89c9-336cdc563106/whereabouts-cni/0.log" Apr 22 19:05:21.364669 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:21.364584 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-snt7h_fea00d3c-4a77-47e4-84b9-89677ae7426c/kube-multus/0.log" Apr 22 19:05:21.427078 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:21.427031 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-s8svp_9feb1c60-1e90-405e-9beb-753e0747aed0/network-metrics-daemon/0.log" Apr 22 19:05:21.446323 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:21.446295 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-s8svp_9feb1c60-1e90-405e-9beb-753e0747aed0/kube-rbac-proxy/0.log" Apr 22 19:05:22.624047 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:22.623959 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knlln_2fec8229-262a-438e-a71a-26d8ef9fda02/ovn-controller/0.log" Apr 22 19:05:22.690289 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:22.690251 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knlln_2fec8229-262a-438e-a71a-26d8ef9fda02/ovn-acl-logging/0.log" Apr 22 19:05:22.708021 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:22.707984 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knlln_2fec8229-262a-438e-a71a-26d8ef9fda02/kube-rbac-proxy-node/0.log" Apr 22 19:05:22.728260 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:22.728226 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knlln_2fec8229-262a-438e-a71a-26d8ef9fda02/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 19:05:22.745573 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:22.745543 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knlln_2fec8229-262a-438e-a71a-26d8ef9fda02/northd/0.log" Apr 22 19:05:22.763673 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:22.763649 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knlln_2fec8229-262a-438e-a71a-26d8ef9fda02/nbdb/0.log" Apr 22 19:05:22.782477 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:22.782451 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knlln_2fec8229-262a-438e-a71a-26d8ef9fda02/sbdb/0.log" Apr 22 19:05:22.930046 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:22.930022 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knlln_2fec8229-262a-438e-a71a-26d8ef9fda02/ovnkube-controller/0.log" Apr 22 19:05:24.187938 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:24.187904 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-2qtfd_daeace97-112e-453b-ae2c-bd7b73b63cc1/check-endpoints/0.log" Apr 22 19:05:24.207533 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:24.207496 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-mh726_7c3eadbd-c6af-4686-bed4-c3a47b257864/network-check-target-container/0.log" Apr 22 19:05:25.142703 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:25.142671 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-rhnfm_a585179c-cf1a-4d62-9c9c-9f26a01e3c39/iptables-alerter/0.log" Apr 22 19:05:25.723925 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:25.723891 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-rf2jd_dc7f3edc-3323-4fbb-8f3f-7862dfc56b51/tuned/0.log" Apr 22 19:05:27.329375 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:27.329341 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-cp6f7_001a3e34-ff95-45f5-a62e-b8389b0e0df0/cluster-samples-operator/0.log" Apr 22 19:05:27.345998 ip-10-0-131-22 kubenswrapper[2572]: I0422 19:05:27.345967 2572 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-cp6f7_001a3e34-ff95-45f5-a62e-b8389b0e0df0/cluster-samples-operator-watch/0.log"