Apr 22 17:50:47.061140 ip-10-0-132-24 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 17:50:47.061149 ip-10-0-132-24 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 17:50:47.061156 ip-10-0-132-24 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 17:50:47.061379 ip-10-0-132-24 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 17:50:57.115892 ip-10-0-132-24 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 17:50:57.115905 ip-10-0-132-24 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 2b64800cbd3d4fda9ef6bf47d1f54c7e --
Apr 22 17:53:05.202101 ip-10-0-132-24 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 17:53:05.631259 ip-10-0-132-24 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:53:05.631259 ip-10-0-132-24 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 17:53:05.631259 ip-10-0-132-24 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:53:05.631259 ip-10-0-132-24 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 17:53:05.631259 ip-10-0-132-24 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:53:05.632471 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.632396 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 17:53:05.634517 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634503 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:05.634517 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634517 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:05.634587 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634521 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:05.634587 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634524 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:05.634587 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634528 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:05.634587 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634530 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:05.634587 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634533 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:05.634587 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634536 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:05.634587 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634539 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:05.634587 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634541 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:05.634587 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634544 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:05.634587 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634547 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:05.634587 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634549 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:05.634587 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634552 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:05.634587 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634554 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:05.634587 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634557 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:05.634587 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634559 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:05.634587 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634562 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:05.634587 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634564 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:05.634587 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634566 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:05.634587 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634569 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:05.634587 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634571 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:05.635066 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634574 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:05.635066 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634576 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:05.635066 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634579 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:05.635066 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634581 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:05.635066 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634588 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:05.635066 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634592 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:05.635066 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634594 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:05.635066 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634597 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:05.635066 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634600 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:05.635066 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634603 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:05.635066 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634605 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:05.635066 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634608 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:05.635066 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634610 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:05.635066 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634613 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:05.635066 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634615 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:05.635066 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634618 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:05.635066 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634620 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:05.635066 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634623 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:05.635066 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634625 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:05.635066 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634628 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:05.635586 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634630 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:05.635586 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634632 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:05.635586 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634635 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:05.635586 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634637 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:05.635586 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634640 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:05.635586 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634643 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:05.635586 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634645 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:05.635586 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634649 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:05.635586 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634661 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:05.635586 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634664 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:05.635586 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634666 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:05.635586 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634669 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:05.635586 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634671 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:05.635586 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634674 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:05.635586 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634677 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:05.635586 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634680 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:05.635586 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634683 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:05.635586 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634685 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:05.635586 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634688 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:05.636075 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634690 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:05.636075 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634694 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:05.636075 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634698 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:05.636075 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634701 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:05.636075 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634703 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:05.636075 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634706 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:05.636075 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634709 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:05.636075 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634711 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:05.636075 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634714 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:05.636075 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634717 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:05.636075 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634719 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:05.636075 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634722 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:05.636075 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634724 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:05.636075 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634727 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:05.636075 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634730 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:05.636075 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634733 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:05.636075 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634736 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:05.636075 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634738 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:05.636075 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634741 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:05.636538 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634743 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:05.636538 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634746 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:05.636538 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634748 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:05.636538 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634751 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:05.636538 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634753 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:05.636538 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.634756 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:05.636538 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635118 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:05.636538 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635123 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:05.636538 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635127 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:05.636538 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635131 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:05.636538 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635134 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:05.636538 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635137 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:05.636538 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635139 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:05.636538 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635142 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:05.636538 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635144 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:05.636538 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635147 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:05.636538 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635149 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:05.636538 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635152 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:05.636538 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635155 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:05.637019 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635157 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:05.637019 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635160 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:05.637019 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635162 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:05.637019 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635164 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:05.637019 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635167 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:05.637019 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635169 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:05.637019 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635172 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:05.637019 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635175 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:05.637019 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635177 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:05.637019 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635180 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:05.637019 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635182 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:05.637019 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635185 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:05.637019 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635187 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:05.637019 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635190 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:05.637019 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635192 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:05.637019 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635195 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:05.637019 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635197 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:05.637019 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635200 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:05.637019 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635202 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:05.637019 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635205 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:05.637493 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635208 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:05.637493 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635210 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:05.637493 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635219 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:05.637493 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635222 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:05.637493 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635224 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:05.637493 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635227 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:05.637493 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635231 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:05.637493 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635234 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:05.637493 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635237 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:05.637493 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635239 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:05.637493 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635242 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:05.637493 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635244 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:05.637493 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635247 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:05.637493 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635249 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:05.637493 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635252 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:05.637493 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635254 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:05.637493 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635257 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:05.637493 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635259 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:05.637493 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635262 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:05.637970 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635265 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:05.637970 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635267 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:05.637970 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635270 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:05.637970 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635272 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:05.637970 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635275 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:05.637970 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635278 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:05.637970 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635280 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:05.637970 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635283 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:05.637970 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635285 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:05.637970 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635288 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:05.637970 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635291 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:05.637970 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635293 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:05.637970 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635296 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:05.637970 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635299 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:05.637970 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635302 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:05.637970 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635305 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:05.637970 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635307 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:05.637970 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635310 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:05.637970 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635312 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:05.637970 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635314 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635317 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635320 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635322 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635325 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635327 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635329 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635332 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635334 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635337 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635339 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635341 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635344 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635346 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.635349 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635447 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635454 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635462 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635467 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635471 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635474 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 17:53:05.638445 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635479 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635483 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635486 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635489 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635493 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635497 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635500 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635503 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635506 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635509 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635512 2574 flags.go:64] FLAG: --cloud-config=""
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635515 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635518 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635522 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635525 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635528 2574 flags.go:64] FLAG: --config-dir=""
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635531 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635534 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635537 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635540 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635543 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635547 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635549 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635552 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 17:53:05.638979 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635555 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635559 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635562 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635567 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635570 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635573 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635576 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635579 2574 flags.go:64] FLAG: --enable-server="true"
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635582 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635586 2574 flags.go:64] FLAG: --event-burst="100"
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635589 2574 flags.go:64] FLAG: --event-qps="50"
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635592 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635595 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635599 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635602 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635606 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635608 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635611 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635615 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635617 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635620 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635623 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635626 2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635629 2574 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635632 2574 flags.go:64] FLAG: --feature-gates=""
Apr 22 17:53:05.639596 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635635 2574 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635638 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635641 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635644 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635647 2574 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635650 2574
flags.go:64] FLAG: --help="false" Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635653 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-132-24.ec2.internal" Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635656 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635659 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635662 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635666 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635670 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635673 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635676 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635679 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635682 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635685 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635688 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635691 2574 flags.go:64] FLAG: --kube-reserved="" Apr 22 17:53:05.640237 ip-10-0-132-24 
kubenswrapper[2574]: I0422 17:53:05.635694 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635696 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635699 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635702 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635705 2574 flags.go:64] FLAG: --lock-file="" Apr 22 17:53:05.640237 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635707 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635710 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635713 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635718 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635721 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635724 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635727 2574 flags.go:64] FLAG: --logging-format="text" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635733 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635737 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635740 2574 flags.go:64] FLAG: 
--manifest-url="" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635743 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635747 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635750 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635754 2574 flags.go:64] FLAG: --max-pods="110" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635757 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635760 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635763 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635766 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635769 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635771 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635775 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635782 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635785 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 17:53:05.640910 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635788 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 17:53:05.640910 ip-10-0-132-24 
kubenswrapper[2574]: I0422 17:53:05.635791 2574 flags.go:64] FLAG: --pod-cidr="" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635794 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635799 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635801 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635804 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635807 2574 flags.go:64] FLAG: --port="10250" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635810 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635813 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-048dc78ebda5d91f0" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635816 2574 flags.go:64] FLAG: --qos-reserved="" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635819 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635822 2574 flags.go:64] FLAG: --register-node="true" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635824 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635827 2574 flags.go:64] FLAG: --register-with-taints="" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635844 2574 flags.go:64] FLAG: --registry-burst="10" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635848 2574 
flags.go:64] FLAG: --registry-qps="5" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635850 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635856 2574 flags.go:64] FLAG: --reserved-memory="" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635859 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635863 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635866 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635869 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635872 2574 flags.go:64] FLAG: --runonce="false" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635875 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635878 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635881 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 22 17:53:05.641479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635884 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635887 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635889 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635893 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 17:53:05.642112 ip-10-0-132-24 
kubenswrapper[2574]: I0422 17:53:05.635896 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635899 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635902 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635904 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635908 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635911 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635914 2574 flags.go:64] FLAG: --system-cgroups="" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635916 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635923 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635926 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635929 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635932 2574 flags.go:64] FLAG: --tls-min-version="" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635935 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635938 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635941 2574 flags.go:64] FLAG: 
--topology-manager-policy-options="" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635944 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635947 2574 flags.go:64] FLAG: --v="2" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635951 2574 flags.go:64] FLAG: --version="false" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635955 2574 flags.go:64] FLAG: --vmodule="" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635960 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.635963 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 17:53:05.642112 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636045 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:53:05.642695 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636049 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:53:05.642695 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636052 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:53:05.642695 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636054 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:53:05.642695 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636057 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:53:05.642695 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636060 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:53:05.642695 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636062 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:53:05.642695 ip-10-0-132-24 
kubenswrapper[2574]: W0422 17:53:05.636065 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:53:05.642695 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636068 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:53:05.642695 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636071 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:53:05.642695 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636073 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:53:05.642695 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636076 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:53:05.642695 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636080 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:53:05.642695 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636083 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:53:05.642695 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636085 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:53:05.642695 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636089 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:53:05.642695 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636091 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:53:05.642695 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636094 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:53:05.642695 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636096 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:53:05.642695 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636100 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 
17:53:05.642695 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636103 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:53:05.643216 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636106 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:53:05.643216 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636108 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:53:05.643216 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636111 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:53:05.643216 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636113 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:53:05.643216 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636116 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:53:05.643216 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636118 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:53:05.643216 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636121 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:53:05.643216 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636123 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:53:05.643216 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636126 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:53:05.643216 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636130 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:53:05.643216 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636133 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:53:05.643216 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636135 2574 feature_gate.go:328] unrecognized 
feature gate: NutanixMultiSubnets Apr 22 17:53:05.643216 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636139 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 17:53:05.643216 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636142 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:53:05.643216 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636145 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:53:05.643216 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636148 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:53:05.643216 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636150 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:53:05.643216 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636153 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:53:05.643216 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636155 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:53:05.643216 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636158 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:53:05.643747 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636161 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:53:05.643747 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636165 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 17:53:05.643747 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636168 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:53:05.643747 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636171 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:53:05.643747 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636177 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:53:05.643747 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636181 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:53:05.643747 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636183 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:53:05.643747 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636186 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:53:05.643747 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636188 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:53:05.643747 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636191 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:53:05.643747 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636195 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:53:05.643747 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636198 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:53:05.643747 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636201 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:53:05.643747 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636203 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:53:05.643747 ip-10-0-132-24 
kubenswrapper[2574]: W0422 17:53:05.636205 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:53:05.643747 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636208 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:53:05.643747 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636211 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:53:05.643747 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636213 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:53:05.643747 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636215 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:53:05.644224 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636218 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:53:05.644224 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636220 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:53:05.644224 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636224 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:53:05.644224 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636227 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:53:05.644224 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636229 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:53:05.644224 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636232 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:53:05.644224 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636234 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:53:05.644224 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636237 2574 feature_gate.go:328] unrecognized feature 
gate: KMSEncryptionProvider Apr 22 17:53:05.644224 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636239 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:53:05.644224 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636242 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:53:05.644224 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636244 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:53:05.644224 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636247 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:53:05.644224 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636249 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:53:05.644224 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636252 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:53:05.644224 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636254 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:53:05.644224 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636257 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:53:05.644224 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636259 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:53:05.644224 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636262 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:53:05.644224 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636265 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:53:05.644224 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636267 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:53:05.644724 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636270 2574 feature_gate.go:328] unrecognized feature 
gate: ImageStreamImportMode
Apr 22 17:53:05.644724 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636272 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:05.644724 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636275 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:05.644724 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636278 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:05.644724 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636281 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:05.644724 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.636283 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:05.644724 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.636289 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:53:05.644724 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.642131 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 17:53:05.644724 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.642145 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 17:53:05.644724 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642189 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:05.644724 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642194 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:05.644724 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642198 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:05.644724 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642201 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:05.644724 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642204 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:05.644724 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642207 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:05.644724 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642210 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:05.645193 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642214 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:05.645193 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642217 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:05.645193 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642219 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:05.645193 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642222 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:05.645193 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642225 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:05.645193 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642228 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:05.645193 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642231 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:05.645193 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642234 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:05.645193 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642237 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:05.645193 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642239 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:05.645193 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642242 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:05.645193 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642246 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:05.645193 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642249 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:05.645193 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642252 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:05.645193 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642256 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:05.645193 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642259 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:05.645193 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642262 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:05.645193 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642264 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:05.645193 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642267 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:05.645639 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642269 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:05.645639 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642271 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:05.645639 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642274 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:05.645639 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642277 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:05.645639 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642279 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:05.645639 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642283 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:05.645639 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642286 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:05.645639 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642288 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:05.645639 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642291 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:05.645639 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642293 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:05.645639 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642296 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:05.645639 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642298 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:05.645639 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642301 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:05.645639 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642303 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:05.645639 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642306 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:05.645639 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642308 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:05.645639 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642311 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:05.645639 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642313 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:05.645639 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642315 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:05.645639 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642318 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:05.646126 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642320 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:05.646126 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642322 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:05.646126 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642325 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:05.646126 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642328 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:05.646126 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642330 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:05.646126 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642332 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:05.646126 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642336 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:05.646126 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642341 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:05.646126 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642343 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:05.646126 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642347 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:05.646126 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642350 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:05.646126 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642353 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:05.646126 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642355 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:05.646126 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642358 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:05.646126 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642361 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:05.646126 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642363 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:05.646126 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642366 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:05.646126 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642369 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:05.646126 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642372 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:05.646604 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642375 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:05.646604 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642377 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:05.646604 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642380 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:05.646604 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642382 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:05.646604 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642385 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:05.646604 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642387 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:05.646604 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642390 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:05.646604 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642392 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:05.646604 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642395 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:05.646604 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642397 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:05.646604 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642400 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:05.646604 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642402 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:05.646604 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642405 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:05.646604 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642407 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:05.646604 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642410 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:05.646604 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642413 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:05.646604 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642416 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:05.646604 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642418 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:05.646604 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642421 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:05.646604 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642423 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:05.647091 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642426 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:05.647091 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.642430 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:53:05.647091 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642537 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:05.647091 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642542 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:05.647091 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642545 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:05.647091 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642548 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:05.647091 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642550 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:05.647091 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642553 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:05.647091 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642557 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:05.647091 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642561 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:05.647091 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642563 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:05.647091 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642567 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:05.647091 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642576 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:05.647091 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642579 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:05.647091 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642581 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:05.647091 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642584 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:05.647500 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642586 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:05.647500 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642589 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:05.647500 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642591 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:05.647500 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642595 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:05.647500 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642598 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:05.647500 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642601 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:05.647500 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642603 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:05.647500 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642605 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:05.647500 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642608 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:05.647500 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642610 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:05.647500 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642613 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:05.647500 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642615 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:05.647500 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642618 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:05.647500 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642620 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:05.647500 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642623 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:05.647500 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642625 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:05.647500 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642627 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:05.647500 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642630 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:05.647500 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642632 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:05.647974 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642635 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:05.647974 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642637 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:05.647974 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642640 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:05.647974 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642642 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:05.647974 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642645 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:05.647974 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642647 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:05.647974 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642650 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:05.647974 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642652 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:05.647974 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642655 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:05.647974 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642657 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:05.647974 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642665 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:05.647974 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642668 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:05.647974 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642670 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:05.647974 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642673 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:05.647974 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642675 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:05.647974 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642677 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:05.647974 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642680 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:05.647974 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642682 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:05.647974 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642684 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:05.647974 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642687 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:05.648459 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642689 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:05.648459 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642691 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:05.648459 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642694 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:05.648459 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642697 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:05.648459 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642700 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:05.648459 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642702 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:05.648459 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642704 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:05.648459 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642707 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:05.648459 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642709 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:05.648459 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642712 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:05.648459 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642714 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:05.648459 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642717 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:05.648459 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642719 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:05.648459 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642721 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:05.648459 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642724 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:05.648459 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642726 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:05.648459 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642729 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:05.648459 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642731 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:05.648459 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642734 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:05.648459 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642736 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:05.649038 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642739 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:05.649038 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642742 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:05.649038 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642744 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:05.649038 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642754 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:05.649038 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642757 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:05.649038 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642759 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:05.649038 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642762 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:05.649038 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642764 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:05.649038 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642766 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:05.649038 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642769 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:05.649038 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642771 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:05.649038 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642774 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:05.649038 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:05.642776 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:05.649038 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.642781 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:53:05.649038 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.643472 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 17:53:05.649390 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.646528 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 17:53:05.649390 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.647825 2574 server.go:1019] "Starting client certificate rotation"
Apr 22 17:53:05.649390 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.647931 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 17:53:05.649390 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.648969 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 17:53:05.672220 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.672204 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 17:53:05.675574 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.675554 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 17:53:05.690934 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.690919 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 22 17:53:05.698041 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.698025 2574 log.go:25] "Validated CRI v1 image API"
Apr 22 17:53:05.699493 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.699477 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 17:53:05.702283 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.702260 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 17:53:05.704107 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.704087 2574 fs.go:135] Filesystem UUIDs: map[0a390b94-ee15-4774-bf3a-6a818686fe30:/dev/nvme0n1p4 14a0eba0-691f-43b9-8061-35c3902214bf:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 22 17:53:05.704190 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.704106 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 17:53:05.710085 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.709988 2574 manager.go:217] Machine: {Timestamp:2026-04-22 17:53:05.707775425 +0000 UTC m=+0.395235823 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3198877 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2986d032953c658ab4ba2a6792e21d SystemUUID:ec2986d0-3295-3c65-8ab4-ba2a6792e21d BootID:2b64800c-bd3d-4fda-9ef6-bf47d1f54c7e Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:3f:02:09:07:77 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:3f:02:09:07:77 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:72:34:01:8c:ab:af Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 17:53:05.710085 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.710080 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 17:53:05.710177 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.710153 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 17:53:05.711370 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.711347 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 17:53:05.711521 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.711373 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-24.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 17:53:05.711563 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.711530 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 17:53:05.711563 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.711538 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 17:53:05.711563 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.711551 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:53:05.712442 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.712430 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:53:05.713386 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.713376 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 17:53:05.713487 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.713478 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 17:53:05.716144 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.716134 2574 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 17:53:05.716177 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.716153 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 17:53:05.716177 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.716165 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 17:53:05.716177 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.716174 2574 kubelet.go:397] "Adding apiserver pod source"
Apr 22 17:53:05.716289 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.716182 2574 apiserver.go:42] "Waiting for node sync before watching
apiserver pods" Apr 22 17:53:05.717232 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.717220 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 17:53:05.717271 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.717239 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 17:53:05.719788 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.719769 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-c97xd" Apr 22 17:53:05.720279 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.720263 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 17:53:05.721789 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.721776 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 17:53:05.723994 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.723977 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 17:53:05.724038 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.724010 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 17:53:05.724038 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.724023 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 17:53:05.724038 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.724037 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 17:53:05.724120 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.724049 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 17:53:05.724120 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.724062 2574 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 17:53:05.724120 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.724075 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 17:53:05.724120 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.724089 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 17:53:05.724221 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.724122 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 17:53:05.724221 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.724136 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 17:53:05.724221 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.724163 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 17:53:05.724221 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.724183 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 17:53:05.726259 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.726247 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 17:53:05.726259 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.726260 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 17:53:05.727633 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.727613 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-c97xd" Apr 22 17:53:05.727714 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:05.727654 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 17:53:05.727906 ip-10-0-132-24 kubenswrapper[2574]: E0422 
17:53:05.727888 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-24.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 17:53:05.730402 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.730390 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 17:53:05.730456 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.730429 2574 server.go:1295] "Started kubelet" Apr 22 17:53:05.730533 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.730514 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 17:53:05.730604 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.730568 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 17:53:05.730650 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.730621 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 17:53:05.733030 ip-10-0-132-24 systemd[1]: Started Kubernetes Kubelet. Apr 22 17:53:05.735398 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.735370 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 22 17:53:05.736572 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.736554 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 17:53:05.742677 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.742661 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 17:53:05.742677 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.742670 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 17:53:05.742819 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:05.742686 2574 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 17:53:05.743347 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.743330 2574 factory.go:55] Registering systemd factory Apr 22 17:53:05.743411 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.743351 2574 factory.go:223] Registration of the systemd container factory successfully Apr 22 17:53:05.743573 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.743473 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 17:53:05.743573 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.743491 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 17:53:05.743573 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.743562 2574 factory.go:153] Registering CRI-O factory Apr 22 17:53:05.743728 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.743576 2574 factory.go:223] Registration of the crio container factory successfully Apr 22 17:53:05.743728 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.743579 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 17:53:05.743728 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.743628 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 22 17:53:05.743728 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.743637 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 22 17:53:05.743728 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.743629 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 17:53:05.743728 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.743669 2574 factory.go:103] Registering Raw factory Apr 22 17:53:05.743728 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.743700 2574 manager.go:1196] Started 
watching for new ooms in manager Apr 22 17:53:05.743983 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:05.743746 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-24.ec2.internal\" not found" Apr 22 17:53:05.744435 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.744422 2574 manager.go:319] Starting recovery of all containers Apr 22 17:53:05.744926 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.744909 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:05.745619 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.745599 2574 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-24.ec2.internal" not found Apr 22 17:53:05.748514 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:05.748491 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-132-24.ec2.internal\" not found" node="ip-10-0-132-24.ec2.internal" Apr 22 17:53:05.751034 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.750988 2574 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 22 17:53:05.755384 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.755266 2574 manager.go:324] Recovery completed Apr 22 17:53:05.759856 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.759823 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:05.761971 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.761945 2574 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-24.ec2.internal" not found Apr 22 17:53:05.762291 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.762277 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-24.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:05.762340 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.762303 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-24.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:05.762340 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.762314 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-24.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:05.762748 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.762731 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 17:53:05.762748 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.762747 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 17:53:05.762824 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.762767 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 22 17:53:05.766529 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.766516 2574 policy_none.go:49] "None policy: Start" Apr 22 17:53:05.766529 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.766531 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 17:53:05.766622 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.766541 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 22 
17:53:05.827143 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.806964 2574 manager.go:341] "Starting Device Plugin manager" Apr 22 17:53:05.827143 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:05.806994 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 17:53:05.827143 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.807007 2574 server.go:85] "Starting device plugin registration server" Apr 22 17:53:05.827143 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.807248 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 17:53:05.827143 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.807271 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 17:53:05.827143 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.807438 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 17:53:05.827143 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.807512 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 17:53:05.827143 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.807521 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 17:53:05.827143 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:05.808035 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 22 17:53:05.827143 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:05.808087 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-24.ec2.internal\" not found" Apr 22 17:53:05.827143 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.818330 2574 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-24.ec2.internal" not found Apr 22 17:53:05.850867 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.850847 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 17:53:05.850945 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.850875 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 17:53:05.850945 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.850889 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 17:53:05.850945 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.850894 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 17:53:05.850945 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:05.850923 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 17:53:05.853450 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.853436 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:05.907704 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.907638 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:05.908482 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.908465 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-24.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:05.908556 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.908497 
2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-24.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:05.908556 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.908509 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-24.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:05.908556 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.908531 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-24.ec2.internal" Apr 22 17:53:05.918581 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.918567 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-24.ec2.internal" Apr 22 17:53:05.918640 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:05.918588 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-24.ec2.internal\": node \"ip-10-0-132-24.ec2.internal\" not found" Apr 22 17:53:05.934699 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:05.934676 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-24.ec2.internal\" not found" Apr 22 17:53:05.951256 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.951235 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-24.ec2.internal"] Apr 22 17:53:05.951316 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.951290 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:05.952006 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.951991 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-24.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:05.952068 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.952017 2574 kubelet_node_status.go:736] "Recording event message 
for node" node="ip-10-0-132-24.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:05.952068 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.952028 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-24.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:05.954248 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.954237 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:05.954412 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.954399 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal" Apr 22 17:53:05.954450 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.954426 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:05.954877 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.954864 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-24.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:05.954877 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.954872 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-24.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:05.954978 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.954893 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-24.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:05.954978 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.954904 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-24.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:05.954978 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.954903 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-24.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:05.954978 
ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.954961 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-24.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:05.956993 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.956976 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-24.ec2.internal" Apr 22 17:53:05.957063 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.957005 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:05.957635 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.957622 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-24.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:05.957697 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.957644 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-24.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:05.957697 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:05.957653 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-24.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:05.984445 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:05.984427 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-24.ec2.internal\" not found" node="ip-10-0-132-24.ec2.internal" Apr 22 17:53:05.988668 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:05.988655 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-24.ec2.internal\" not found" node="ip-10-0-132-24.ec2.internal" Apr 22 17:53:06.034905 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:06.034888 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-24.ec2.internal\" not found" Apr 
22 17:53:06.044894 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.044875 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/017fe086330ddbb9d8db4ca21ed5605a-config\") pod \"kube-apiserver-proxy-ip-10-0-132-24.ec2.internal\" (UID: \"017fe086330ddbb9d8db4ca21ed5605a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-24.ec2.internal" Apr 22 17:53:06.044946 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.044899 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a90191fab7fb65406914e2962ac77e93-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal\" (UID: \"a90191fab7fb65406914e2962ac77e93\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal" Apr 22 17:53:06.044946 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.044917 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a90191fab7fb65406914e2962ac77e93-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal\" (UID: \"a90191fab7fb65406914e2962ac77e93\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal" Apr 22 17:53:06.135315 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:06.135299 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-24.ec2.internal\" not found" Apr 22 17:53:06.145639 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.145623 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a90191fab7fb65406914e2962ac77e93-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal\" (UID: \"a90191fab7fb65406914e2962ac77e93\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal" Apr 22 17:53:06.145697 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.145643 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a90191fab7fb65406914e2962ac77e93-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal\" (UID: \"a90191fab7fb65406914e2962ac77e93\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal" Apr 22 17:53:06.145697 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.145662 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/017fe086330ddbb9d8db4ca21ed5605a-config\") pod \"kube-apiserver-proxy-ip-10-0-132-24.ec2.internal\" (UID: \"017fe086330ddbb9d8db4ca21ed5605a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-24.ec2.internal" Apr 22 17:53:06.145761 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.145697 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/017fe086330ddbb9d8db4ca21ed5605a-config\") pod \"kube-apiserver-proxy-ip-10-0-132-24.ec2.internal\" (UID: \"017fe086330ddbb9d8db4ca21ed5605a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-24.ec2.internal" Apr 22 17:53:06.145761 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.145710 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a90191fab7fb65406914e2962ac77e93-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal\" (UID: \"a90191fab7fb65406914e2962ac77e93\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal" Apr 22 17:53:06.145761 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.145719 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a90191fab7fb65406914e2962ac77e93-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal\" (UID: \"a90191fab7fb65406914e2962ac77e93\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal" Apr 22 17:53:06.236018 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:06.236000 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-24.ec2.internal\" not found" Apr 22 17:53:06.286491 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.286474 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal" Apr 22 17:53:06.290885 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.290871 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-24.ec2.internal" Apr 22 17:53:06.336375 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:06.336352 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-24.ec2.internal\" not found" Apr 22 17:53:06.436888 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:06.436867 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-24.ec2.internal\" not found" Apr 22 17:53:06.537425 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:06.537383 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-24.ec2.internal\" not found" Apr 22 17:53:06.637880 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:06.637861 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-24.ec2.internal\" not found" Apr 22 17:53:06.647221 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.647209 2574 transport.go:147] "Certificate rotation detected, shutting down client 
connections to start using new credentials"
Apr 22 17:53:06.647334 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.647318 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:53:06.647369 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.647350 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:53:06.729889 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.729860 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 17:48:05 +0000 UTC" deadline="2028-01-24 01:39:17.83902782 +0000 UTC"
Apr 22 17:53:06.729889 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.729884 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15391h46m11.109145734s"
Apr 22 17:53:06.737919 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:06.737903 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-24.ec2.internal\" not found"
Apr 22 17:53:06.743075 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.743060 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 17:53:06.760995 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.760969 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 17:53:06.779957 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.779930 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-lqtz2"
Apr 22 17:53:06.785992 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.785972 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-lqtz2"
Apr 22 17:53:06.838797 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:06.838746 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-24.ec2.internal\" not found"
Apr 22 17:53:06.868792 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:06.868740 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod017fe086330ddbb9d8db4ca21ed5605a.slice/crio-b1721311d925837d501cd5b2898bb96ae25a2ee55b14e3313af821b74c6565f6 WatchSource:0}: Error finding container b1721311d925837d501cd5b2898bb96ae25a2ee55b14e3313af821b74c6565f6: Status 404 returned error can't find the container with id b1721311d925837d501cd5b2898bb96ae25a2ee55b14e3313af821b74c6565f6
Apr 22 17:53:06.869116 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:06.869096 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda90191fab7fb65406914e2962ac77e93.slice/crio-12e7bc8ff483d15f730567bc354d76d485fc8cb896eac7d7a890c5bffa631e87 WatchSource:0}: Error finding container 12e7bc8ff483d15f730567bc354d76d485fc8cb896eac7d7a890c5bffa631e87: Status 404 returned error can't find the container with id 12e7bc8ff483d15f730567bc354d76d485fc8cb896eac7d7a890c5bffa631e87
Apr 22 17:53:06.873458 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.873443 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 17:53:06.881549 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.881534 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:53:06.943748 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.943729 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal"
Apr 22 17:53:06.954641 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.954628 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 17:53:06.956906 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.956894 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-24.ec2.internal"
Apr 22 17:53:06.965906 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:06.965887 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 17:53:07.270474 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.270448 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:53:07.717663 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.717586 2574 apiserver.go:52] "Watching apiserver"
Apr 22 17:53:07.725432 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.725401 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 17:53:07.726714 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.726687 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-132-24.ec2.internal","openshift-multus/multus-cq6p9","openshift-multus/network-metrics-daemon-fztfm","openshift-network-operator/iptables-alerter-lql7j","openshift-ovn-kubernetes/ovnkube-node-zrmhz","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk","openshift-cluster-node-tuning-operator/tuned-np549","openshift-image-registry/node-ca-nzqmg","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal","openshift-multus/multus-additional-cni-plugins-972f2","openshift-network-diagnostics/network-check-target-4jvnk","kube-system/konnectivity-agent-lmpbm"]
Apr 22 17:53:07.731388 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.731353 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cq6p9"
Apr 22 17:53:07.734616 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.734588 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fztfm"
Apr 22 17:53:07.734716 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:07.734669 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fztfm" podUID="027c3a56-b141-4f0e-beda-4bbc2fdc45c6"
Apr 22 17:53:07.735325 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.735303 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 17:53:07.735325 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.735308 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 17:53:07.735542 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.735326 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-2frrq\""
Apr 22 17:53:07.735542 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.735508 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 17:53:07.735635 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.735566 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 17:53:07.737135 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.736783 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lql7j"
Apr 22 17:53:07.737135 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.736875 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk"
Apr 22 17:53:07.739183 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.739158 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 17:53:07.739795 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.739485 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 17:53:07.739795 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.739521 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 17:53:07.739795 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.739581 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-9ml8k\""
Apr 22 17:53:07.739795 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.739583 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 17:53:07.739795 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.739736 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-cxxtn\""
Apr 22 17:53:07.740416 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.740398 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 17:53:07.740416 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.740411 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:53:07.741402 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.741384 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-np549"
Apr 22 17:53:07.743629 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.743608 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:53:07.743772 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.743752 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 17:53:07.743858 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.743806 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nzqmg"
Apr 22 17:53:07.743919 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.743757 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jnfvt\""
Apr 22 17:53:07.746034 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.746017 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-972f2"
Apr 22 17:53:07.746507 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.746136 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lmpbm"
Apr 22 17:53:07.746774 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.746755 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 17:53:07.746872 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.746816 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5mp6q\""
Apr 22 17:53:07.746872 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.746820 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 17:53:07.746984 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.746898 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 17:53:07.748251 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.748076 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 17:53:07.748817 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.748547 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-h2rxh\""
Apr 22 17:53:07.748817 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.748635 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 17:53:07.748943 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.748898 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 17:53:07.749991 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.749955 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 17:53:07.750122 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.750023 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-v965b\""
Apr 22 17:53:07.750735 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.750421 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.752777 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.752762 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4jvnk"
Apr 22 17:53:07.752887 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:07.752814 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4jvnk" podUID="eb07fcd6-cc65-437c-9bc0-d210593e3edf"
Apr 22 17:53:07.753026 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.752992 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 17:53:07.753156 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.753140 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 17:53:07.753420 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.753366 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 17:53:07.753499 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.753416 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 17:53:07.753499 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.753429 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-t78px\""
Apr 22 17:53:07.753679 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.753652 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 17:53:07.754387 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.754370 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 17:53:07.754886 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.754699 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-system-cni-dir\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2"
Apr 22 17:53:07.754886 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.754743 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvkgp\" (UniqueName: \"kubernetes.io/projected/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-kube-api-access-vvkgp\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2"
Apr 22 17:53:07.754886 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.754798 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-multus-daemon-config\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9"
Apr 22 17:53:07.754886 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.754845 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr9gl\" (UniqueName: \"kubernetes.io/projected/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-kube-api-access-vr9gl\") pod \"network-metrics-daemon-fztfm\" (UID: \"027c3a56-b141-4f0e-beda-4bbc2fdc45c6\") " pod="openshift-multus/network-metrics-daemon-fztfm"
Apr 22 17:53:07.754886 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.754879 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-modprobe-d\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549"
Apr 22 17:53:07.755154 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.754916 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-os-release\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2"
Apr 22 17:53:07.755154 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.754954 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-system-cni-dir\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9"
Apr 22 17:53:07.755154 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.755010 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a684f094-f2e9-4f18-b33b-e466f94313d8-host\") pod \"node-ca-nzqmg\" (UID: \"a684f094-f2e9-4f18-b33b-e466f94313d8\") " pod="openshift-image-registry/node-ca-nzqmg"
Apr 22 17:53:07.755154 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.755053 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-systemd\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549"
Apr 22 17:53:07.755154 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.755106 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2"
Apr 22 17:53:07.755154 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.755147 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0f0aa83e-c1fb-48e4-b074-4915c38e5138-konnectivity-ca\") pod \"konnectivity-agent-lmpbm\" (UID: \"0f0aa83e-c1fb-48e4-b074-4915c38e5138\") " pod="kube-system/konnectivity-agent-lmpbm"
Apr 22 17:53:07.755688 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.755412 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-multus-socket-dir-parent\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9"
Apr 22 17:53:07.755688 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.755458 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-hostroot\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9"
Apr 22 17:53:07.755688 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.755491 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-sysctl-d\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549"
Apr 22 17:53:07.755688 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.755522 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0f0aa83e-c1fb-48e4-b074-4915c38e5138-agent-certs\") pod \"konnectivity-agent-lmpbm\" (UID: \"0f0aa83e-c1fb-48e4-b074-4915c38e5138\") " pod="kube-system/konnectivity-agent-lmpbm"
Apr 22 17:53:07.755688 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.755571 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-run\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549"
Apr 22 17:53:07.756126 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.755602 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-tuned\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549"
Apr 22 17:53:07.756126 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.755797 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-host-var-lib-cni-multus\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9"
Apr 22 17:53:07.756126 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.755869 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxbmq\" (UniqueName: \"kubernetes.io/projected/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-kube-api-access-pxbmq\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9"
Apr 22 17:53:07.756126 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.755963 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/60335f89-00b6-4100-bb92-7e321aab6731-registration-dir\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk"
Apr 22 17:53:07.756126 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.756042 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-sysctl-conf\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549"
Apr 22 17:53:07.756126 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.756087 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-multus-cni-dir\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9"
Apr 22 17:53:07.756402 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.756287 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-multus-conf-dir\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9"
Apr 22 17:53:07.756402 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.756335 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-cni-binary-copy\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2"
Apr 22 17:53:07.756601 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.756423 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs\") pod \"network-metrics-daemon-fztfm\" (UID: \"027c3a56-b141-4f0e-beda-4bbc2fdc45c6\") " pod="openshift-multus/network-metrics-daemon-fztfm"
Apr 22 17:53:07.756601 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.756460 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxrbc\" (UniqueName: \"kubernetes.io/projected/2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6-kube-api-access-zxrbc\") pod \"iptables-alerter-lql7j\" (UID: \"2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6\") " pod="openshift-network-operator/iptables-alerter-lql7j"
Apr 22 17:53:07.756601 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.756498 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-var-lib-kubelet\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549"
Apr 22 17:53:07.756601 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.756591 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-host-run-netns\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9"
Apr 22 17:53:07.756885 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.756634 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6-iptables-alerter-script\") pod \"iptables-alerter-lql7j\" (UID: \"2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6\") " pod="openshift-network-operator/iptables-alerter-lql7j"
Apr 22 17:53:07.756885 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.756681 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-sysconfig\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549"
Apr 22 17:53:07.756885 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.756739 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-sys\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549"
Apr 22 17:53:07.756885 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.756876 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-host-run-multus-certs\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9"
Apr 22 17:53:07.757068 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.756898 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-host-var-lib-cni-bin\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9"
Apr 22 17:53:07.757068 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.756942 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-etc-kubernetes\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9"
Apr 22 17:53:07.757068 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.756989 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-cni-binary-copy\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9"
Apr 22 17:53:07.757068 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.757009 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-lib-modules\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549"
Apr 22 17:53:07.757068 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.757040 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xwf9\" (UniqueName: \"kubernetes.io/projected/5b4c353a-3fa1-44c0-954e-74df34b1b224-kube-api-access-9xwf9\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549"
Apr 22 17:53:07.757292 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.757110 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2"
Apr 22 17:53:07.757292 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.757151 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/60335f89-00b6-4100-bb92-7e321aab6731-sys-fs\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk"
Apr 22 17:53:07.757390 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.757313 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9bhh\" (UniqueName: \"kubernetes.io/projected/a684f094-f2e9-4f18-b33b-e466f94313d8-kube-api-access-x9bhh\") pod \"node-ca-nzqmg\" (UID: \"a684f094-f2e9-4f18-b33b-e466f94313d8\") " pod="openshift-image-registry/node-ca-nzqmg"
Apr 22 17:53:07.757390 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.757342 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtb7s\" (UniqueName: \"kubernetes.io/projected/60335f89-00b6-4100-bb92-7e321aab6731-kube-api-access-gtb7s\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk"
Apr 22 17:53:07.757390 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.757376 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6-host-slash\") pod \"iptables-alerter-lql7j\" (UID: \"2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6\") " pod="openshift-network-operator/iptables-alerter-lql7j"
Apr 22 17:53:07.757524 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.757431 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-os-release\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9"
Apr 22 17:53:07.757574 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.757503 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/60335f89-00b6-4100-bb92-7e321aab6731-device-dir\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk"
Apr 22 17:53:07.757625 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.757604 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/60335f89-00b6-4100-bb92-7e321aab6731-etc-selinux\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk"
Apr 22 17:53:07.757679 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.757641 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5b4c353a-3fa1-44c0-954e-74df34b1b224-tmp\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549"
Apr 22 17:53:07.757735 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.757691 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-cnibin\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9"
Apr 22 17:53:07.757791 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.757719 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-host-run-k8s-cni-cncf-io\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9"
Apr 22 17:53:07.757791 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.757760 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-host-var-lib-kubelet\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9"
Apr 22 17:53:07.757791 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.757788 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60335f89-00b6-4100-bb92-7e321aab6731-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk"
Apr 22 17:53:07.757956 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.757811 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a684f094-f2e9-4f18-b33b-e466f94313d8-serviceca\") pod \"node-ca-nzqmg\" (UID: \"a684f094-f2e9-4f18-b33b-e466f94313d8\") " pod="openshift-image-registry/node-ca-nzqmg"
Apr 22 17:53:07.757956 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.757898 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2"
Apr 22 17:53:07.757956 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.757927 2574
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-kubernetes\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.758095 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.757964 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-host\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.758095 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.758004 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-cnibin\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2" Apr 22 17:53:07.758095 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.758031 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/60335f89-00b6-4100-bb92-7e321aab6731-socket-dir\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk" Apr 22 17:53:07.786701 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.786673 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:48:06 +0000 UTC" deadline="2028-01-12 15:55:30.091898172 +0000 UTC" Apr 22 17:53:07.786701 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.786700 2574 
certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15118h2m22.305200366s" Apr 22 17:53:07.844295 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.844275 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 17:53:07.854713 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.854664 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-24.ec2.internal" event={"ID":"017fe086330ddbb9d8db4ca21ed5605a","Type":"ContainerStarted","Data":"b1721311d925837d501cd5b2898bb96ae25a2ee55b14e3313af821b74c6565f6"} Apr 22 17:53:07.855534 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.855516 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal" event={"ID":"a90191fab7fb65406914e2962ac77e93","Type":"ContainerStarted","Data":"12e7bc8ff483d15f730567bc354d76d485fc8cb896eac7d7a890c5bffa631e87"} Apr 22 17:53:07.858750 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.858726 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2" Apr 22 17:53:07.858827 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.858764 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0f0aa83e-c1fb-48e4-b074-4915c38e5138-konnectivity-ca\") pod \"konnectivity-agent-lmpbm\" (UID: \"0f0aa83e-c1fb-48e4-b074-4915c38e5138\") " pod="kube-system/konnectivity-agent-lmpbm" Apr 22 17:53:07.858827 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.858785 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-multus-socket-dir-parent\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.858827 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.858800 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-hostroot\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.859000 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.858854 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-var-lib-openvswitch\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.859000 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.858973 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2" Apr 22 17:53:07.859076 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.859007 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-multus-socket-dir-parent\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.859076 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.859049 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-sysctl-d\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.859162 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.859076 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0f0aa83e-c1fb-48e4-b074-4915c38e5138-agent-certs\") pod \"konnectivity-agent-lmpbm\" (UID: \"0f0aa83e-c1fb-48e4-b074-4915c38e5138\") " pod="kube-system/konnectivity-agent-lmpbm" Apr 22 17:53:07.859162 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.859081 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-hostroot\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.859162 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.859108 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-run-ovn\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.859162 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.859141 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-log-socket\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.859325 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.859165 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4abe7788-23bd-436c-bc7c-1de96634aa32-ovn-node-metrics-cert\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.859325 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.859191 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4abe7788-23bd-436c-bc7c-1de96634aa32-ovnkube-script-lib\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.859325 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.859218 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-run\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.859325 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.859240 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-sysctl-d\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.859325 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.859240 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-tuned\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.859325 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.859281 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-host-var-lib-cni-multus\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.859325 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.859306 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxbmq\" (UniqueName: \"kubernetes.io/projected/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-kube-api-access-pxbmq\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.859652 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.859410 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-host-var-lib-cni-multus\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.859652 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.859449 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-run\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.859751 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.859706 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 17:53:07.859926 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.859899 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-etc-openvswitch\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.859987 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.859939 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0f0aa83e-c1fb-48e4-b074-4915c38e5138-konnectivity-ca\") pod \"konnectivity-agent-lmpbm\" (UID: \"0f0aa83e-c1fb-48e4-b074-4915c38e5138\") " pod="kube-system/konnectivity-agent-lmpbm" Apr 22 17:53:07.860049 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.859945 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/60335f89-00b6-4100-bb92-7e321aab6731-registration-dir\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk" Apr 22 17:53:07.860049 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.859999 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/60335f89-00b6-4100-bb92-7e321aab6731-registration-dir\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk" Apr 22 17:53:07.860049 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860019 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-sysctl-conf\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.860177 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860067 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-multus-cni-dir\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.860177 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860092 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-multus-conf-dir\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.860177 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860119 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-systemd-units\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.860177 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860148 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-run-openvswitch\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.860177 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860153 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" 
(UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-sysctl-conf\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.860351 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860180 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-cni-binary-copy\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2" Apr 22 17:53:07.860351 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860207 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-multus-conf-dir\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.860351 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860208 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs\") pod \"network-metrics-daemon-fztfm\" (UID: \"027c3a56-b141-4f0e-beda-4bbc2fdc45c6\") " pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:53:07.860351 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860181 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-multus-cni-dir\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.860351 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860244 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxrbc\" 
(UniqueName: \"kubernetes.io/projected/2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6-kube-api-access-zxrbc\") pod \"iptables-alerter-lql7j\" (UID: \"2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6\") " pod="openshift-network-operator/iptables-alerter-lql7j" Apr 22 17:53:07.860351 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860273 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-var-lib-kubelet\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.860351 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860304 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-host-run-netns\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.860351 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860330 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-run-ovn-kubernetes\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.860679 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860354 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-cni-bin\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.860679 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860372 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-var-lib-kubelet\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.860679 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860378 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-cni-netd\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.860679 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860497 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6-iptables-alerter-script\") pod \"iptables-alerter-lql7j\" (UID: \"2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6\") " pod="openshift-network-operator/iptables-alerter-lql7j" Apr 22 17:53:07.860679 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:07.860531 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:07.860679 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860526 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-sysconfig\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.860679 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860605 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-sys\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.860679 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:07.860632 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs podName:027c3a56-b141-4f0e-beda-4bbc2fdc45c6 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:08.360600032 +0000 UTC m=+3.048060417 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs") pod "network-metrics-daemon-fztfm" (UID: "027c3a56-b141-4f0e-beda-4bbc2fdc45c6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:07.860679 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860653 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-sys\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.861074 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860730 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-cni-binary-copy\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2" Apr 22 17:53:07.861074 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860731 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-host-run-multus-certs\") pod \"multus-cq6p9\" (UID: 
\"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.861074 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860772 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-node-log\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.861074 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860773 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-host-run-netns\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.861074 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860814 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-sysconfig\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.861074 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860886 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-host-run-multus-certs\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.861074 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860887 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-host-var-lib-cni-bin\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") 
" pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.861074 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860922 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-etc-kubernetes\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.861074 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860933 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-host-var-lib-cni-bin\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.861074 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860967 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-cni-binary-copy\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.861074 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860987 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-etc-kubernetes\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.861074 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.860995 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.861074 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861034 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-lib-modules\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.861074 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861055 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xwf9\" (UniqueName: \"kubernetes.io/projected/5b4c353a-3fa1-44c0-954e-74df34b1b224-kube-api-access-9xwf9\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.861699 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861073 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2" Apr 22 17:53:07.861699 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861108 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/60335f89-00b6-4100-bb92-7e321aab6731-sys-fs\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk" Apr 22 17:53:07.861699 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861130 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9bhh\" (UniqueName: 
\"kubernetes.io/projected/a684f094-f2e9-4f18-b33b-e466f94313d8-kube-api-access-x9bhh\") pod \"node-ca-nzqmg\" (UID: \"a684f094-f2e9-4f18-b33b-e466f94313d8\") " pod="openshift-image-registry/node-ca-nzqmg" Apr 22 17:53:07.861699 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861154 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-run-netns\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.861699 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861165 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6-iptables-alerter-script\") pod \"iptables-alerter-lql7j\" (UID: \"2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6\") " pod="openshift-network-operator/iptables-alerter-lql7j" Apr 22 17:53:07.861699 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861177 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd778\" (UniqueName: \"kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778\") pod \"network-check-target-4jvnk\" (UID: \"eb07fcd6-cc65-437c-9bc0-d210593e3edf\") " pod="openshift-network-diagnostics/network-check-target-4jvnk" Apr 22 17:53:07.861699 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861192 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-lib-modules\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.861699 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861234 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/60335f89-00b6-4100-bb92-7e321aab6731-sys-fs\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk" Apr 22 17:53:07.861699 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861296 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtb7s\" (UniqueName: \"kubernetes.io/projected/60335f89-00b6-4100-bb92-7e321aab6731-kube-api-access-gtb7s\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk" Apr 22 17:53:07.861699 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861337 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6-host-slash\") pod \"iptables-alerter-lql7j\" (UID: \"2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6\") " pod="openshift-network-operator/iptables-alerter-lql7j" Apr 22 17:53:07.861699 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861360 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-os-release\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.861699 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861394 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/60335f89-00b6-4100-bb92-7e321aab6731-device-dir\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk" Apr 22 17:53:07.861699 ip-10-0-132-24 
kubenswrapper[2574]: I0422 17:53:07.861418 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/60335f89-00b6-4100-bb92-7e321aab6731-etc-selinux\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk" Apr 22 17:53:07.861699 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861442 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5b4c353a-3fa1-44c0-954e-74df34b1b224-tmp\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.861699 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861552 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-cni-binary-copy\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.861699 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861604 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-cnibin\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.861699 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861634 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-host-run-k8s-cni-cncf-io\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.862430 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861665 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-host-var-lib-kubelet\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.862430 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861691 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60335f89-00b6-4100-bb92-7e321aab6731-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk" Apr 22 17:53:07.862430 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861718 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/60335f89-00b6-4100-bb92-7e321aab6731-etc-selinux\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk" Apr 22 17:53:07.862430 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861736 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a684f094-f2e9-4f18-b33b-e466f94313d8-serviceca\") pod \"node-ca-nzqmg\" (UID: \"a684f094-f2e9-4f18-b33b-e466f94313d8\") " pod="openshift-image-registry/node-ca-nzqmg" Apr 22 17:53:07.862430 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861749 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2" Apr 22 
17:53:07.862430 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861761 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2" Apr 22 17:53:07.862430 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861763 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-host-run-k8s-cni-cncf-io\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.862430 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861639 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-os-release\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.862430 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861797 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/60335f89-00b6-4100-bb92-7e321aab6731-device-dir\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk" Apr 22 17:53:07.862430 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861802 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-cnibin\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.862430 
ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861819 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60335f89-00b6-4100-bb92-7e321aab6731-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk" Apr 22 17:53:07.862430 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861818 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6-host-slash\") pod \"iptables-alerter-lql7j\" (UID: \"2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6\") " pod="openshift-network-operator/iptables-alerter-lql7j" Apr 22 17:53:07.862430 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861847 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-kubernetes\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.862430 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861883 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-host\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.862430 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861884 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-host-var-lib-kubelet\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.862430 ip-10-0-132-24 
kubenswrapper[2574]: I0422 17:53:07.861911 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-cnibin\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2" Apr 22 17:53:07.862430 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861948 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-kubernetes\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.863137 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861948 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-cnibin\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2" Apr 22 17:53:07.863137 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861971 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-host\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.863137 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.861980 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/60335f89-00b6-4100-bb92-7e321aab6731-socket-dir\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk" Apr 22 17:53:07.863137 
ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862011 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-kubelet\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.863137 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862033 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4abe7788-23bd-436c-bc7c-1de96634aa32-ovnkube-config\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.863137 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862049 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4abe7788-23bd-436c-bc7c-1de96634aa32-env-overrides\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.863137 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862070 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-system-cni-dir\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2" Apr 22 17:53:07.863137 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862088 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vvkgp\" (UniqueName: \"kubernetes.io/projected/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-kube-api-access-vvkgp\") pod \"multus-additional-cni-plugins-972f2\" 
(UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2" Apr 22 17:53:07.863137 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862114 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-system-cni-dir\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2" Apr 22 17:53:07.863137 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862130 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-multus-daemon-config\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.863137 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862175 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vr9gl\" (UniqueName: \"kubernetes.io/projected/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-kube-api-access-vr9gl\") pod \"network-metrics-daemon-fztfm\" (UID: \"027c3a56-b141-4f0e-beda-4bbc2fdc45c6\") " pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:53:07.863137 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862202 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-modprobe-d\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.863137 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862229 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-os-release\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2" Apr 22 17:53:07.863137 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862282 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-system-cni-dir\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.863137 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862305 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a684f094-f2e9-4f18-b33b-e466f94313d8-host\") pod \"node-ca-nzqmg\" (UID: \"a684f094-f2e9-4f18-b33b-e466f94313d8\") " pod="openshift-image-registry/node-ca-nzqmg" Apr 22 17:53:07.863137 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862310 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a684f094-f2e9-4f18-b33b-e466f94313d8-serviceca\") pod \"node-ca-nzqmg\" (UID: \"a684f094-f2e9-4f18-b33b-e466f94313d8\") " pod="openshift-image-registry/node-ca-nzqmg" Apr 22 17:53:07.863137 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862332 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-slash\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.863691 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862366 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2" Apr 22 17:53:07.863691 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862357 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-run-systemd\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.863691 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862404 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjdpv\" (UniqueName: \"kubernetes.io/projected/4abe7788-23bd-436c-bc7c-1de96634aa32-kube-api-access-vjdpv\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.863691 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862436 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-systemd\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.863691 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862442 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-system-cni-dir\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.863691 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862060 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/60335f89-00b6-4100-bb92-7e321aab6731-socket-dir\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk" Apr 22 17:53:07.863691 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862647 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a684f094-f2e9-4f18-b33b-e466f94313d8-host\") pod \"node-ca-nzqmg\" (UID: \"a684f094-f2e9-4f18-b33b-e466f94313d8\") " pod="openshift-image-registry/node-ca-nzqmg" Apr 22 17:53:07.863691 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862670 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-modprobe-d\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.863691 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862702 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-os-release\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2" Apr 22 17:53:07.863691 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862718 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-systemd\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.863691 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.862738 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-multus-daemon-config\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.863691 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.863137 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5b4c353a-3fa1-44c0-954e-74df34b1b224-etc-tuned\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.863691 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.863250 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0f0aa83e-c1fb-48e4-b074-4915c38e5138-agent-certs\") pod \"konnectivity-agent-lmpbm\" (UID: \"0f0aa83e-c1fb-48e4-b074-4915c38e5138\") " pod="kube-system/konnectivity-agent-lmpbm" Apr 22 17:53:07.864220 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.864205 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5b4c353a-3fa1-44c0-954e-74df34b1b224-tmp\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.882202 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.882019 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxrbc\" (UniqueName: \"kubernetes.io/projected/2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6-kube-api-access-zxrbc\") pod \"iptables-alerter-lql7j\" (UID: \"2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6\") " pod="openshift-network-operator/iptables-alerter-lql7j" Apr 22 17:53:07.882202 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.882159 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xwf9\" (UniqueName: 
\"kubernetes.io/projected/5b4c353a-3fa1-44c0-954e-74df34b1b224-kube-api-access-9xwf9\") pod \"tuned-np549\" (UID: \"5b4c353a-3fa1-44c0-954e-74df34b1b224\") " pod="openshift-cluster-node-tuning-operator/tuned-np549" Apr 22 17:53:07.883061 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.882928 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtb7s\" (UniqueName: \"kubernetes.io/projected/60335f89-00b6-4100-bb92-7e321aab6731-kube-api-access-gtb7s\") pod \"aws-ebs-csi-driver-node-zbgjk\" (UID: \"60335f89-00b6-4100-bb92-7e321aab6731\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk" Apr 22 17:53:07.883159 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.883089 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr9gl\" (UniqueName: \"kubernetes.io/projected/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-kube-api-access-vr9gl\") pod \"network-metrics-daemon-fztfm\" (UID: \"027c3a56-b141-4f0e-beda-4bbc2fdc45c6\") " pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:53:07.883159 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.883124 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9bhh\" (UniqueName: \"kubernetes.io/projected/a684f094-f2e9-4f18-b33b-e466f94313d8-kube-api-access-x9bhh\") pod \"node-ca-nzqmg\" (UID: \"a684f094-f2e9-4f18-b33b-e466f94313d8\") " pod="openshift-image-registry/node-ca-nzqmg" Apr 22 17:53:07.884269 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.884249 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvkgp\" (UniqueName: \"kubernetes.io/projected/f0fd569b-4e3e-4771-8dec-d6f16a52e2b9-kube-api-access-vvkgp\") pod \"multus-additional-cni-plugins-972f2\" (UID: \"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9\") " pod="openshift-multus/multus-additional-cni-plugins-972f2" Apr 22 17:53:07.884944 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.884854 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxbmq\" (UniqueName: \"kubernetes.io/projected/64e9c497-2c3a-4764-89fd-29dff8b7c4b1-kube-api-access-pxbmq\") pod \"multus-cq6p9\" (UID: \"64e9c497-2c3a-4764-89fd-29dff8b7c4b1\") " pod="openshift-multus/multus-cq6p9" Apr 22 17:53:07.963316 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963291 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-kubelet\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.963458 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963323 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4abe7788-23bd-436c-bc7c-1de96634aa32-ovnkube-config\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.963458 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963339 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4abe7788-23bd-436c-bc7c-1de96634aa32-env-overrides\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.963458 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963411 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-kubelet\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.963458 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963439 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-slash\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.963687 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963465 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-run-systemd\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.963687 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963488 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjdpv\" (UniqueName: \"kubernetes.io/projected/4abe7788-23bd-436c-bc7c-1de96634aa32-kube-api-access-vjdpv\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.963687 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963514 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-var-lib-openvswitch\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.963687 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963520 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-slash\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:07.963687 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963538 
2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-run-systemd\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.963687 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963556 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-var-lib-openvswitch\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.963687 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963635 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-run-ovn\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.963687 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963662 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-log-socket\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.963687 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963687 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4abe7788-23bd-436c-bc7c-1de96634aa32-ovn-node-metrics-cert\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964077 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963713 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4abe7788-23bd-436c-bc7c-1de96634aa32-ovnkube-script-lib\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964077 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963716 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-run-ovn\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964077 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963754 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-etc-openvswitch\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964077 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963787 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-systemd-units\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964077 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963794 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-etc-openvswitch\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964077 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963810 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-run-openvswitch\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964077 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963863 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-systemd-units\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964077 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963871 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-run-ovn-kubernetes\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964077 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963896 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-cni-bin\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964077 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963905 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-run-openvswitch\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964077 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963908 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4abe7788-23bd-436c-bc7c-1de96634aa32-env-overrides\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964077 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963918 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-cni-netd\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964077 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963940 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-log-socket\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964077 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963945 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-node-log\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964077 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963987 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-run-ovn-kubernetes\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964077 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.963987 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-cni-netd\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964077 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.964023 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-cni-bin\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964077 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.964029 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-node-log\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964733 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.964065 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964733 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.964086 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-run-netns\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964733 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.964101 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qd778\" (UniqueName: \"kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778\") pod \"network-check-target-4jvnk\" (UID: \"eb07fcd6-cc65-437c-9bc0-d210593e3edf\") " pod="openshift-network-diagnostics/network-check-target-4jvnk"
Apr 22 17:53:07.964733 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.964147 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964733 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.964151 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4abe7788-23bd-436c-bc7c-1de96634aa32-host-run-netns\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964733 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.964238 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4abe7788-23bd-436c-bc7c-1de96634aa32-ovnkube-script-lib\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.964733 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.964636 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4abe7788-23bd-436c-bc7c-1de96634aa32-ovnkube-config\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.966056 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.966033 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4abe7788-23bd-436c-bc7c-1de96634aa32-ovn-node-metrics-cert\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:07.970146 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:07.970126 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:07.970304 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:07.970149 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:07.970304 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:07.970161 2574 projected.go:194] Error preparing data for projected volume kube-api-access-qd778 for pod openshift-network-diagnostics/network-check-target-4jvnk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:07.970304 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:07.970262 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778 podName:eb07fcd6-cc65-437c-9bc0-d210593e3edf nodeName:}" failed. No retries permitted until 2026-04-22 17:53:08.470244617 +0000 UTC m=+3.157705018 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-qd778" (UniqueName: "kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778") pod "network-check-target-4jvnk" (UID: "eb07fcd6-cc65-437c-9bc0-d210593e3edf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:07.972282 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:07.972265 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjdpv\" (UniqueName: \"kubernetes.io/projected/4abe7788-23bd-436c-bc7c-1de96634aa32-kube-api-access-vjdpv\") pod \"ovnkube-node-zrmhz\" (UID: \"4abe7788-23bd-436c-bc7c-1de96634aa32\") " pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:08.044055 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.044031 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cq6p9"
Apr 22 17:53:08.052795 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.052772 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lql7j"
Apr 22 17:53:08.061998 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.061978 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk"
Apr 22 17:53:08.067258 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.067238 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-np549"
Apr 22 17:53:08.073766 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.073745 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nzqmg"
Apr 22 17:53:08.079270 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.079254 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-972f2"
Apr 22 17:53:08.084756 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.084737 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lmpbm"
Apr 22 17:53:08.090280 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.090263 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz"
Apr 22 17:53:08.134178 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.134158 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:53:08.150564 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.150544 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:53:08.366616 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.366550 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs\") pod \"network-metrics-daemon-fztfm\" (UID: \"027c3a56-b141-4f0e-beda-4bbc2fdc45c6\") " pod="openshift-multus/network-metrics-daemon-fztfm"
Apr 22 17:53:08.366743 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:08.366650 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:08.366743 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:08.366700 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs podName:027c3a56-b141-4f0e-beda-4bbc2fdc45c6 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:09.366683617 +0000 UTC m=+4.054144014 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs") pod "network-metrics-daemon-fztfm" (UID: "027c3a56-b141-4f0e-beda-4bbc2fdc45c6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:08.402002 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:08.401979 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b4c353a_3fa1_44c0_954e_74df34b1b224.slice/crio-617db06ff12d7da40d33aa613e37aef79aa41b15feb18498b8a387f8008dbfad WatchSource:0}: Error finding container 617db06ff12d7da40d33aa613e37aef79aa41b15feb18498b8a387f8008dbfad: Status 404 returned error can't find the container with id 617db06ff12d7da40d33aa613e37aef79aa41b15feb18498b8a387f8008dbfad
Apr 22 17:53:08.403303 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:08.403274 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60335f89_00b6_4100_bb92_7e321aab6731.slice/crio-b874fca8bf48233b4e710b9b0662a76e807c22a962a169eccb29f8562cd7042b WatchSource:0}: Error finding container b874fca8bf48233b4e710b9b0662a76e807c22a962a169eccb29f8562cd7042b: Status 404 returned error can't find the container with id b874fca8bf48233b4e710b9b0662a76e807c22a962a169eccb29f8562cd7042b
Apr 22 17:53:08.404217 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:08.404193 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64e9c497_2c3a_4764_89fd_29dff8b7c4b1.slice/crio-d3a9ef88f328ef4a8563611e57f7c0e1ad2584188788050d8122d3c89bfdadb0 WatchSource:0}: Error finding container d3a9ef88f328ef4a8563611e57f7c0e1ad2584188788050d8122d3c89bfdadb0: Status 404 returned error can't find the container with id d3a9ef88f328ef4a8563611e57f7c0e1ad2584188788050d8122d3c89bfdadb0
Apr 22 17:53:08.408070 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:08.408029 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0fd569b_4e3e_4771_8dec_d6f16a52e2b9.slice/crio-b889087f3f05bf357541ecd88019f4a7cb9168a89b7ab35d037095e861a65a0b WatchSource:0}: Error finding container b889087f3f05bf357541ecd88019f4a7cb9168a89b7ab35d037095e861a65a0b: Status 404 returned error can't find the container with id b889087f3f05bf357541ecd88019f4a7cb9168a89b7ab35d037095e861a65a0b
Apr 22 17:53:08.408733 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:08.408712 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fa19584_980e_4ca9_a0f2_25e2e3bd0ba6.slice/crio-c0f638758d1414f8e0466e774cf893cd1862911a62134317228cf9f6ea7955b6 WatchSource:0}: Error finding container c0f638758d1414f8e0466e774cf893cd1862911a62134317228cf9f6ea7955b6: Status 404 returned error can't find the container with id c0f638758d1414f8e0466e774cf893cd1862911a62134317228cf9f6ea7955b6
Apr 22 17:53:08.410116 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:08.410053 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4abe7788_23bd_436c_bc7c_1de96634aa32.slice/crio-a46ab7f1ff1118885de6638e62d7a08586a4b7846732515c12a942448d7c7338 WatchSource:0}: Error finding container a46ab7f1ff1118885de6638e62d7a08586a4b7846732515c12a942448d7c7338: Status 404 returned error can't find the container with id a46ab7f1ff1118885de6638e62d7a08586a4b7846732515c12a942448d7c7338
Apr 22 17:53:08.411001 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:08.410926 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda684f094_f2e9_4f18_b33b_e466f94313d8.slice/crio-2c0c0aee32abf2ba0f45c56cb93774a58608d9cae646ee8c0374888240886986 WatchSource:0}: Error finding container 2c0c0aee32abf2ba0f45c56cb93774a58608d9cae646ee8c0374888240886986: Status 404 returned error can't find the container with id 2c0c0aee32abf2ba0f45c56cb93774a58608d9cae646ee8c0374888240886986
Apr 22 17:53:08.568110 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.567950 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qd778\" (UniqueName: \"kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778\") pod \"network-check-target-4jvnk\" (UID: \"eb07fcd6-cc65-437c-9bc0-d210593e3edf\") " pod="openshift-network-diagnostics/network-check-target-4jvnk"
Apr 22 17:53:08.568204 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:08.568086 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:08.568204 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:08.568178 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:08.568204 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:08.568188 2574 projected.go:194] Error preparing data for projected volume kube-api-access-qd778 for pod openshift-network-diagnostics/network-check-target-4jvnk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:08.568304 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:08.568244 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778 podName:eb07fcd6-cc65-437c-9bc0-d210593e3edf nodeName:}" failed. No retries permitted until 2026-04-22 17:53:09.568231346 +0000 UTC m=+4.255691730 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-qd778" (UniqueName: "kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778") pod "network-check-target-4jvnk" (UID: "eb07fcd6-cc65-437c-9bc0-d210593e3edf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:08.787493 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.787414 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:48:06 +0000 UTC" deadline="2027-10-24 13:43:06.381974601 +0000 UTC"
Apr 22 17:53:08.787493 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.787452 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13195h49m57.594526662s"
Apr 22 17:53:08.880095 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.880034 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nzqmg" event={"ID":"a684f094-f2e9-4f18-b33b-e466f94313d8","Type":"ContainerStarted","Data":"2c0c0aee32abf2ba0f45c56cb93774a58608d9cae646ee8c0374888240886986"}
Apr 22 17:53:08.886577 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.886539 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lql7j" event={"ID":"2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6","Type":"ContainerStarted","Data":"c0f638758d1414f8e0466e774cf893cd1862911a62134317228cf9f6ea7955b6"}
Apr 22 17:53:08.890229 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.890169 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-972f2" event={"ID":"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9","Type":"ContainerStarted","Data":"b889087f3f05bf357541ecd88019f4a7cb9168a89b7ab35d037095e861a65a0b"}
Apr 22 17:53:08.900368 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.900327 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cq6p9" event={"ID":"64e9c497-2c3a-4764-89fd-29dff8b7c4b1","Type":"ContainerStarted","Data":"d3a9ef88f328ef4a8563611e57f7c0e1ad2584188788050d8122d3c89bfdadb0"}
Apr 22 17:53:08.902844 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.902807 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-np549" event={"ID":"5b4c353a-3fa1-44c0-954e-74df34b1b224","Type":"ContainerStarted","Data":"617db06ff12d7da40d33aa613e37aef79aa41b15feb18498b8a387f8008dbfad"}
Apr 22 17:53:08.906749 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.906710 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-24.ec2.internal" event={"ID":"017fe086330ddbb9d8db4ca21ed5605a","Type":"ContainerStarted","Data":"7d22cf77b1b70c8d1da71cc953fb6e94adf3a995ff96672dc72c2698ebe62838"}
Apr 22 17:53:08.911713 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.911690 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lmpbm" event={"ID":"0f0aa83e-c1fb-48e4-b074-4915c38e5138","Type":"ContainerStarted","Data":"446502d5fc7e488ffccf9092aa4dd84a87f487b0d508d010f2937ad8d0e4e8c5"}
Apr 22 17:53:08.918123 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.918100 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" event={"ID":"4abe7788-23bd-436c-bc7c-1de96634aa32","Type":"ContainerStarted","Data":"a46ab7f1ff1118885de6638e62d7a08586a4b7846732515c12a942448d7c7338"}
Apr 22 17:53:08.926756 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:08.926734 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk" event={"ID":"60335f89-00b6-4100-bb92-7e321aab6731","Type":"ContainerStarted","Data":"b874fca8bf48233b4e710b9b0662a76e807c22a962a169eccb29f8562cd7042b"}
Apr 22 17:53:09.375325 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:09.375273 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs\") pod \"network-metrics-daemon-fztfm\" (UID: \"027c3a56-b141-4f0e-beda-4bbc2fdc45c6\") " pod="openshift-multus/network-metrics-daemon-fztfm"
Apr 22 17:53:09.375474 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:09.375414 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:09.375538 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:09.375495 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs podName:027c3a56-b141-4f0e-beda-4bbc2fdc45c6 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:11.37547569 +0000 UTC m=+6.062936075 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs") pod "network-metrics-daemon-fztfm" (UID: "027c3a56-b141-4f0e-beda-4bbc2fdc45c6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:09.577265 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:09.577228 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qd778\" (UniqueName: \"kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778\") pod \"network-check-target-4jvnk\" (UID: \"eb07fcd6-cc65-437c-9bc0-d210593e3edf\") " pod="openshift-network-diagnostics/network-check-target-4jvnk"
Apr 22 17:53:09.577415 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:09.577391 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:09.577415 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:09.577413 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:09.577530 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:09.577425 2574 projected.go:194] Error preparing data for projected volume kube-api-access-qd778 for pod openshift-network-diagnostics/network-check-target-4jvnk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:09.577530 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:09.577482 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778 podName:eb07fcd6-cc65-437c-9bc0-d210593e3edf nodeName:}" failed. No retries permitted until 2026-04-22 17:53:11.577463718 +0000 UTC m=+6.264924120 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-qd778" (UniqueName: "kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778") pod "network-check-target-4jvnk" (UID: "eb07fcd6-cc65-437c-9bc0-d210593e3edf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:09.852419 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:09.851668 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4jvnk"
Apr 22 17:53:09.852419 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:09.851788 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4jvnk" podUID="eb07fcd6-cc65-437c-9bc0-d210593e3edf"
Apr 22 17:53:09.852419 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:09.852271 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fztfm"
Apr 22 17:53:09.852419 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:09.852378 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fztfm" podUID="027c3a56-b141-4f0e-beda-4bbc2fdc45c6"
Apr 22 17:53:09.943738 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:09.943702 2574 generic.go:358] "Generic (PLEG): container finished" podID="a90191fab7fb65406914e2962ac77e93" containerID="29da4430430cd7936522725d86923ff753370dd76b1e0462e1dcfa9370f81246" exitCode=0
Apr 22 17:53:09.944635 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:09.944606 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal" event={"ID":"a90191fab7fb65406914e2962ac77e93","Type":"ContainerDied","Data":"29da4430430cd7936522725d86923ff753370dd76b1e0462e1dcfa9370f81246"}
Apr 22 17:53:09.959663 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:09.959441 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-24.ec2.internal" podStartSLOduration=3.959424781 podStartE2EDuration="3.959424781s" podCreationTimestamp="2026-04-22 17:53:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:53:08.920644226 +0000 UTC m=+3.608104638" watchObservedRunningTime="2026-04-22 17:53:09.959424781 +0000 UTC m=+4.646885188"
Apr 22 17:53:10.949303 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:10.949268 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal" event={"ID":"a90191fab7fb65406914e2962ac77e93","Type":"ContainerStarted","Data":"350526dffeb5bad5b554005ea828b4f1ef8b7b6d6e84b8e2af073ade6aae3ab6"}
Apr 22 17:53:11.391363 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:11.391327 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs\") pod \"network-metrics-daemon-fztfm\" (UID: \"027c3a56-b141-4f0e-beda-4bbc2fdc45c6\") " pod="openshift-multus/network-metrics-daemon-fztfm"
Apr 22 17:53:11.391617 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:11.391573 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:11.391687 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:11.391635 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs podName:027c3a56-b141-4f0e-beda-4bbc2fdc45c6 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:15.391617577 +0000 UTC m=+10.079077968 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs") pod "network-metrics-daemon-fztfm" (UID: "027c3a56-b141-4f0e-beda-4bbc2fdc45c6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:11.593566 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:11.593533 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qd778\" (UniqueName: \"kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778\") pod \"network-check-target-4jvnk\" (UID: \"eb07fcd6-cc65-437c-9bc0-d210593e3edf\") " pod="openshift-network-diagnostics/network-check-target-4jvnk"
Apr 22 17:53:11.593737 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:11.593678 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:11.593737 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:11.593694 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:11.593737 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:11.593706 2574 projected.go:194] Error preparing data for projected volume kube-api-access-qd778 for pod openshift-network-diagnostics/network-check-target-4jvnk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:11.593977 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:11.593764 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778 podName:eb07fcd6-cc65-437c-9bc0-d210593e3edf nodeName:}" failed. No retries permitted until 2026-04-22 17:53:15.593746565 +0000 UTC m=+10.281206964 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-qd778" (UniqueName: "kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778") pod "network-check-target-4jvnk" (UID: "eb07fcd6-cc65-437c-9bc0-d210593e3edf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:11.851889 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:11.851858 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4jvnk"
Apr 22 17:53:11.852064 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:11.851907 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fztfm"
Apr 22 17:53:11.852064 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:11.851976 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4jvnk" podUID="eb07fcd6-cc65-437c-9bc0-d210593e3edf"
Apr 22 17:53:11.852177 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:11.852100 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fztfm" podUID="027c3a56-b141-4f0e-beda-4bbc2fdc45c6"
Apr 22 17:53:13.852256 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:13.851764 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fztfm"
Apr 22 17:53:13.852256 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:13.851913 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fztfm" podUID="027c3a56-b141-4f0e-beda-4bbc2fdc45c6"
Apr 22 17:53:13.852256 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:13.851978 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4jvnk"
Apr 22 17:53:13.852256 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:13.852088 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-4jvnk" podUID="eb07fcd6-cc65-437c-9bc0-d210593e3edf" Apr 22 17:53:15.421738 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:15.421654 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs\") pod \"network-metrics-daemon-fztfm\" (UID: \"027c3a56-b141-4f0e-beda-4bbc2fdc45c6\") " pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:53:15.422160 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:15.421821 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:15.422160 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:15.421917 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs podName:027c3a56-b141-4f0e-beda-4bbc2fdc45c6 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:23.421896278 +0000 UTC m=+18.109356677 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs") pod "network-metrics-daemon-fztfm" (UID: "027c3a56-b141-4f0e-beda-4bbc2fdc45c6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:15.624067 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:15.623304 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qd778\" (UniqueName: \"kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778\") pod \"network-check-target-4jvnk\" (UID: \"eb07fcd6-cc65-437c-9bc0-d210593e3edf\") " pod="openshift-network-diagnostics/network-check-target-4jvnk" Apr 22 17:53:15.624067 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:15.623540 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:53:15.624067 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:15.623589 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:53:15.624067 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:15.623602 2574 projected.go:194] Error preparing data for projected volume kube-api-access-qd778 for pod openshift-network-diagnostics/network-check-target-4jvnk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:15.624067 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:15.623663 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778 podName:eb07fcd6-cc65-437c-9bc0-d210593e3edf nodeName:}" failed. 
No retries permitted until 2026-04-22 17:53:23.623644463 +0000 UTC m=+18.311104862 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-qd778" (UniqueName: "kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778") pod "network-check-target-4jvnk" (UID: "eb07fcd6-cc65-437c-9bc0-d210593e3edf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:15.852862 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:15.852610 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4jvnk" Apr 22 17:53:15.852862 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:15.852706 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4jvnk" podUID="eb07fcd6-cc65-437c-9bc0-d210593e3edf" Apr 22 17:53:15.852862 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:15.852725 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:53:15.852862 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:15.852803 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fztfm" podUID="027c3a56-b141-4f0e-beda-4bbc2fdc45c6" Apr 22 17:53:17.852112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:17.852077 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4jvnk" Apr 22 17:53:17.852112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:17.852117 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:53:17.852498 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:17.852200 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4jvnk" podUID="eb07fcd6-cc65-437c-9bc0-d210593e3edf" Apr 22 17:53:17.852498 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:17.852344 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fztfm" podUID="027c3a56-b141-4f0e-beda-4bbc2fdc45c6" Apr 22 17:53:19.851270 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:19.851239 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:53:19.851667 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:19.851384 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fztfm" podUID="027c3a56-b141-4f0e-beda-4bbc2fdc45c6" Apr 22 17:53:19.851667 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:19.851425 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4jvnk" Apr 22 17:53:19.851667 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:19.851517 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4jvnk" podUID="eb07fcd6-cc65-437c-9bc0-d210593e3edf" Apr 22 17:53:20.454310 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:20.454256 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-24.ec2.internal" podStartSLOduration=14.454242423 podStartE2EDuration="14.454242423s" podCreationTimestamp="2026-04-22 17:53:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:53:10.964071503 +0000 UTC m=+5.651531907" watchObservedRunningTime="2026-04-22 17:53:20.454242423 +0000 UTC m=+15.141702827" Apr 22 17:53:20.454466 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:20.454454 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hbjmn"] Apr 22 17:53:20.458613 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:20.458592 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hbjmn" Apr 22 17:53:20.461091 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:20.461067 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 17:53:20.461294 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:20.461262 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 17:53:20.461401 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:20.461303 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mwpxm\"" Apr 22 17:53:20.560969 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:20.560908 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dck5\" (UniqueName: \"kubernetes.io/projected/95190e1c-03c4-4dcf-b739-7c181cb38f82-kube-api-access-7dck5\") pod \"node-resolver-hbjmn\" (UID: \"95190e1c-03c4-4dcf-b739-7c181cb38f82\") " pod="openshift-dns/node-resolver-hbjmn" Apr 22 17:53:20.560969 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:20.560952 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/95190e1c-03c4-4dcf-b739-7c181cb38f82-tmp-dir\") pod \"node-resolver-hbjmn\" (UID: \"95190e1c-03c4-4dcf-b739-7c181cb38f82\") " pod="openshift-dns/node-resolver-hbjmn" Apr 22 17:53:20.561133 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:20.560981 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/95190e1c-03c4-4dcf-b739-7c181cb38f82-hosts-file\") pod \"node-resolver-hbjmn\" (UID: \"95190e1c-03c4-4dcf-b739-7c181cb38f82\") " pod="openshift-dns/node-resolver-hbjmn" Apr 22 17:53:20.662094 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:20.662062 
2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dck5\" (UniqueName: \"kubernetes.io/projected/95190e1c-03c4-4dcf-b739-7c181cb38f82-kube-api-access-7dck5\") pod \"node-resolver-hbjmn\" (UID: \"95190e1c-03c4-4dcf-b739-7c181cb38f82\") " pod="openshift-dns/node-resolver-hbjmn" Apr 22 17:53:20.662249 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:20.662109 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/95190e1c-03c4-4dcf-b739-7c181cb38f82-tmp-dir\") pod \"node-resolver-hbjmn\" (UID: \"95190e1c-03c4-4dcf-b739-7c181cb38f82\") " pod="openshift-dns/node-resolver-hbjmn" Apr 22 17:53:20.662249 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:20.662138 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/95190e1c-03c4-4dcf-b739-7c181cb38f82-hosts-file\") pod \"node-resolver-hbjmn\" (UID: \"95190e1c-03c4-4dcf-b739-7c181cb38f82\") " pod="openshift-dns/node-resolver-hbjmn" Apr 22 17:53:20.662249 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:20.662213 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/95190e1c-03c4-4dcf-b739-7c181cb38f82-hosts-file\") pod \"node-resolver-hbjmn\" (UID: \"95190e1c-03c4-4dcf-b739-7c181cb38f82\") " pod="openshift-dns/node-resolver-hbjmn" Apr 22 17:53:20.662475 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:20.662451 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/95190e1c-03c4-4dcf-b739-7c181cb38f82-tmp-dir\") pod \"node-resolver-hbjmn\" (UID: \"95190e1c-03c4-4dcf-b739-7c181cb38f82\") " pod="openshift-dns/node-resolver-hbjmn" Apr 22 17:53:20.671525 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:20.671505 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7dck5\" (UniqueName: \"kubernetes.io/projected/95190e1c-03c4-4dcf-b739-7c181cb38f82-kube-api-access-7dck5\") pod \"node-resolver-hbjmn\" (UID: \"95190e1c-03c4-4dcf-b739-7c181cb38f82\") " pod="openshift-dns/node-resolver-hbjmn" Apr 22 17:53:20.768191 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:20.768152 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hbjmn" Apr 22 17:53:21.851149 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:21.851109 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4jvnk" Apr 22 17:53:21.851149 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:21.851140 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:53:21.851657 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:21.851233 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4jvnk" podUID="eb07fcd6-cc65-437c-9bc0-d210593e3edf" Apr 22 17:53:21.851657 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:21.851390 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fztfm" podUID="027c3a56-b141-4f0e-beda-4bbc2fdc45c6" Apr 22 17:53:23.482492 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:23.482460 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs\") pod \"network-metrics-daemon-fztfm\" (UID: \"027c3a56-b141-4f0e-beda-4bbc2fdc45c6\") " pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:53:23.482960 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:23.482607 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:23.482960 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:23.482678 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs podName:027c3a56-b141-4f0e-beda-4bbc2fdc45c6 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:39.482657311 +0000 UTC m=+34.170117698 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs") pod "network-metrics-daemon-fztfm" (UID: "027c3a56-b141-4f0e-beda-4bbc2fdc45c6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:23.684035 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:23.684002 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qd778\" (UniqueName: \"kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778\") pod \"network-check-target-4jvnk\" (UID: \"eb07fcd6-cc65-437c-9bc0-d210593e3edf\") " pod="openshift-network-diagnostics/network-check-target-4jvnk" Apr 22 17:53:23.684186 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:23.684134 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:53:23.684186 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:23.684150 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:53:23.684186 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:23.684158 2574 projected.go:194] Error preparing data for projected volume kube-api-access-qd778 for pod openshift-network-diagnostics/network-check-target-4jvnk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:23.684315 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:23.684211 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778 podName:eb07fcd6-cc65-437c-9bc0-d210593e3edf nodeName:}" failed. 
No retries permitted until 2026-04-22 17:53:39.684192229 +0000 UTC m=+34.371652618 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-qd778" (UniqueName: "kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778") pod "network-check-target-4jvnk" (UID: "eb07fcd6-cc65-437c-9bc0-d210593e3edf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:23.851521 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:23.851440 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:53:23.851661 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:23.851440 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4jvnk" Apr 22 17:53:23.851661 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:23.851571 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fztfm" podUID="027c3a56-b141-4f0e-beda-4bbc2fdc45c6" Apr 22 17:53:23.851661 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:23.851649 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4jvnk" podUID="eb07fcd6-cc65-437c-9bc0-d210593e3edf" Apr 22 17:53:24.412982 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:24.412955 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95190e1c_03c4_4dcf_b739_7c181cb38f82.slice/crio-ddf000d90a448bbca2eed9994a2d6e9990a80dad0b63b523be4b54a45afcfc80 WatchSource:0}: Error finding container ddf000d90a448bbca2eed9994a2d6e9990a80dad0b63b523be4b54a45afcfc80: Status 404 returned error can't find the container with id ddf000d90a448bbca2eed9994a2d6e9990a80dad0b63b523be4b54a45afcfc80 Apr 22 17:53:24.970736 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:24.970484 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lmpbm" event={"ID":"0f0aa83e-c1fb-48e4-b074-4915c38e5138","Type":"ContainerStarted","Data":"bad9e97bc3f1d4477678cb60b83bb28ca40601fb940f160536591be404458558"} Apr 22 17:53:24.972018 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:24.971994 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" event={"ID":"4abe7788-23bd-436c-bc7c-1de96634aa32","Type":"ContainerStarted","Data":"d356f2de249572014012bc872f4a5eaca7b4ca1104945e5e089a034db7616901"} Apr 22 17:53:24.973413 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:24.973379 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk" event={"ID":"60335f89-00b6-4100-bb92-7e321aab6731","Type":"ContainerStarted","Data":"077ad4a8c975c9dbf3876144fa5d735ac80a1da3623ac0f5da556064cd01dbeb"} Apr 22 17:53:24.974785 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:24.974750 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nzqmg" 
event={"ID":"a684f094-f2e9-4f18-b33b-e466f94313d8","Type":"ContainerStarted","Data":"ab0d3e5816e8b639cfcdbb700fc41f7d12254736f375772a9f8bc2b404dfabb2"} Apr 22 17:53:24.976205 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:24.976186 2574 generic.go:358] "Generic (PLEG): container finished" podID="f0fd569b-4e3e-4771-8dec-d6f16a52e2b9" containerID="cfcbca6557f660fb756f954068af445cb6af39fb5bb207fa0d90dfe992ed4f76" exitCode=0 Apr 22 17:53:24.976279 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:24.976221 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-972f2" event={"ID":"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9","Type":"ContainerDied","Data":"cfcbca6557f660fb756f954068af445cb6af39fb5bb207fa0d90dfe992ed4f76"} Apr 22 17:53:24.977642 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:24.977588 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cq6p9" event={"ID":"64e9c497-2c3a-4764-89fd-29dff8b7c4b1","Type":"ContainerStarted","Data":"ea01f2761426a09fe033e78fa5551e8e6be2755506a6d8320c0241238de4c3fc"} Apr 22 17:53:24.980281 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:24.979664 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-np549" event={"ID":"5b4c353a-3fa1-44c0-954e-74df34b1b224","Type":"ContainerStarted","Data":"71e251ee4390ab8fa6d5d29f92d599041d7503d339b764dbe48ee0c0da9eb255"} Apr 22 17:53:24.982166 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:24.982142 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hbjmn" event={"ID":"95190e1c-03c4-4dcf-b739-7c181cb38f82","Type":"ContainerStarted","Data":"0d2385c8d4eca8f57b506bf9f91362bbfabdd954bb7c40cb81d3adbcffca7904"} Apr 22 17:53:24.982257 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:24.982171 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hbjmn" 
event={"ID":"95190e1c-03c4-4dcf-b739-7c181cb38f82","Type":"ContainerStarted","Data":"ddf000d90a448bbca2eed9994a2d6e9990a80dad0b63b523be4b54a45afcfc80"} Apr 22 17:53:24.988315 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:24.988271 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-lmpbm" podStartSLOduration=3.992619962 podStartE2EDuration="19.988257753s" podCreationTimestamp="2026-04-22 17:53:05 +0000 UTC" firstStartedPulling="2026-04-22 17:53:08.413424226 +0000 UTC m=+3.100884614" lastFinishedPulling="2026-04-22 17:53:24.409062011 +0000 UTC m=+19.096522405" observedRunningTime="2026-04-22 17:53:24.987384326 +0000 UTC m=+19.674844725" watchObservedRunningTime="2026-04-22 17:53:24.988257753 +0000 UTC m=+19.675718161" Apr 22 17:53:25.001652 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:25.001620 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hbjmn" podStartSLOduration=5.001610045 podStartE2EDuration="5.001610045s" podCreationTimestamp="2026-04-22 17:53:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:53:25.00136479 +0000 UTC m=+19.688825196" watchObservedRunningTime="2026-04-22 17:53:25.001610045 +0000 UTC m=+19.689070451" Apr 22 17:53:25.023302 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:25.023165 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-nzqmg" podStartSLOduration=4.044320159 podStartE2EDuration="20.023152116s" podCreationTimestamp="2026-04-22 17:53:05 +0000 UTC" firstStartedPulling="2026-04-22 17:53:08.413441266 +0000 UTC m=+3.100901651" lastFinishedPulling="2026-04-22 17:53:24.392273206 +0000 UTC m=+19.079733608" observedRunningTime="2026-04-22 17:53:25.022853476 +0000 UTC m=+19.710313882" watchObservedRunningTime="2026-04-22 17:53:25.023152116 +0000 UTC m=+19.710612525" 
Apr 22 17:53:25.104046 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:25.103901 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cq6p9" podStartSLOduration=3.891095525 podStartE2EDuration="20.103887518s" podCreationTimestamp="2026-04-22 17:53:05 +0000 UTC" firstStartedPulling="2026-04-22 17:53:08.406783317 +0000 UTC m=+3.094243701" lastFinishedPulling="2026-04-22 17:53:24.619575293 +0000 UTC m=+19.307035694" observedRunningTime="2026-04-22 17:53:25.085867512 +0000 UTC m=+19.773327918" watchObservedRunningTime="2026-04-22 17:53:25.103887518 +0000 UTC m=+19.791347924" Apr 22 17:53:25.104351 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:25.104329 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-np549" podStartSLOduration=4.097130002 podStartE2EDuration="20.104322668s" podCreationTimestamp="2026-04-22 17:53:05 +0000 UTC" firstStartedPulling="2026-04-22 17:53:08.405560825 +0000 UTC m=+3.093021224" lastFinishedPulling="2026-04-22 17:53:24.412753494 +0000 UTC m=+19.100213890" observedRunningTime="2026-04-22 17:53:25.104122451 +0000 UTC m=+19.791582849" watchObservedRunningTime="2026-04-22 17:53:25.104322668 +0000 UTC m=+19.791783073" Apr 22 17:53:25.437735 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:25.437709 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-lmpbm" Apr 22 17:53:25.438197 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:25.438183 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-lmpbm" Apr 22 17:53:25.852425 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:25.852402 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4jvnk" Apr 22 17:53:25.852541 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:25.852499 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4jvnk" podUID="eb07fcd6-cc65-437c-9bc0-d210593e3edf" Apr 22 17:53:25.852777 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:25.852642 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:53:25.852777 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:25.852739 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fztfm" podUID="027c3a56-b141-4f0e-beda-4bbc2fdc45c6" Apr 22 17:53:25.942128 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:25.942089 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 17:53:25.985088 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:25.985059 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lql7j" event={"ID":"2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6","Type":"ContainerStarted","Data":"e0293e48cba2bb5a91fc922d76288c260950a7b090aefcf9beaaa86808de3108"} Apr 22 17:53:25.987732 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:25.987678 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 17:53:25.988021 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:25.988001 2574 generic.go:358] "Generic (PLEG): container finished" podID="4abe7788-23bd-436c-bc7c-1de96634aa32" containerID="f583d524ab3e8a68d1786adb5a97053811d3faf31cb0844b4ab37f120e519188" exitCode=1 Apr 22 17:53:25.988113 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:25.988069 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" event={"ID":"4abe7788-23bd-436c-bc7c-1de96634aa32","Type":"ContainerStarted","Data":"0e2f89d1358cf305dd3d29819aa3593a5563e931d41fdceb51e50a52b168da7c"} Apr 22 17:53:25.988113 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:25.988106 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" event={"ID":"4abe7788-23bd-436c-bc7c-1de96634aa32","Type":"ContainerStarted","Data":"537ee74e7aeb26058688c3213fad921873292dca27589d6a8e27bb80a47da491"} Apr 22 17:53:25.988222 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:25.988119 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" event={"ID":"4abe7788-23bd-436c-bc7c-1de96634aa32","Type":"ContainerStarted","Data":"81bb6dc638dd7c518531aec4b6e77b45b09e5a44ebe30fc2e1e729703a3f57fc"} Apr 22 17:53:25.988222 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:25.988131 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" event={"ID":"4abe7788-23bd-436c-bc7c-1de96634aa32","Type":"ContainerStarted","Data":"c0e409caca8677549a90981e3147c63cf4c2d33f7bed547faaad498c7aa63372"} Apr 22 17:53:25.988222 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:25.988147 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" event={"ID":"4abe7788-23bd-436c-bc7c-1de96634aa32","Type":"ContainerDied","Data":"f583d524ab3e8a68d1786adb5a97053811d3faf31cb0844b4ab37f120e519188"} Apr 22 17:53:25.989782 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:25.989761 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk" event={"ID":"60335f89-00b6-4100-bb92-7e321aab6731","Type":"ContainerStarted","Data":"124d39ffb2766b96a706972a837f3570179d7d1f4c39b51c86cd95cf94878017"} Apr 22 17:53:26.000746 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:26.000713 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-lql7j" podStartSLOduration=5.002009514 podStartE2EDuration="21.000702197s" podCreationTimestamp="2026-04-22 17:53:05 +0000 UTC" firstStartedPulling="2026-04-22 17:53:08.410346458 +0000 UTC m=+3.097806850" lastFinishedPulling="2026-04-22 17:53:24.409039135 +0000 UTC m=+19.096499533" observedRunningTime="2026-04-22 17:53:26.000682234 +0000 UTC m=+20.688142681" watchObservedRunningTime="2026-04-22 17:53:26.000702197 +0000 UTC m=+20.688162622" Apr 22 17:53:26.818978 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:26.818879 2574 
reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T17:53:25.942109251Z","UUID":"498f16db-0ba0-4fa0-9b9b-89862053b929","Handler":null,"Name":"","Endpoint":""} Apr 22 17:53:26.821686 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:26.821663 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 17:53:26.821686 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:26.821689 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 17:53:26.998101 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:26.997900 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 17:53:27.851229 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:27.851200 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:53:27.851438 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:27.851199 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4jvnk" Apr 22 17:53:27.851438 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:27.851326 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fztfm" podUID="027c3a56-b141-4f0e-beda-4bbc2fdc45c6" Apr 22 17:53:27.851438 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:27.851425 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4jvnk" podUID="eb07fcd6-cc65-437c-9bc0-d210593e3edf" Apr 22 17:53:27.995091 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:27.995043 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk" event={"ID":"60335f89-00b6-4100-bb92-7e321aab6731","Type":"ContainerStarted","Data":"7d1cfe5c85d621772c324d2921b18628fc60866015b868aa0797750de9b30833"} Apr 22 17:53:28.013196 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:28.013171 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-jpw95"] Apr 22 17:53:28.028003 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:28.027967 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zbgjk" podStartSLOduration=4.453855054 podStartE2EDuration="23.027955003s" podCreationTimestamp="2026-04-22 17:53:05 +0000 UTC" firstStartedPulling="2026-04-22 17:53:08.40601576 +0000 UTC m=+3.093476158" lastFinishedPulling="2026-04-22 17:53:26.980115721 +0000 UTC m=+21.667576107" observedRunningTime="2026-04-22 17:53:28.027523919 +0000 UTC m=+22.714984329" watchObservedRunningTime="2026-04-22 17:53:28.027955003 +0000 UTC m=+22.715415411" Apr 22 17:53:28.032966 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:28.032947 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-jpw95" Apr 22 17:53:28.033043 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:28.033021 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jpw95" podUID="9926e4c9-979f-42d2-a480-34e0d7b96299" Apr 22 17:53:28.117763 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:28.117741 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9926e4c9-979f-42d2-a480-34e0d7b96299-kubelet-config\") pod \"global-pull-secret-syncer-jpw95\" (UID: \"9926e4c9-979f-42d2-a480-34e0d7b96299\") " pod="kube-system/global-pull-secret-syncer-jpw95" Apr 22 17:53:28.117890 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:28.117827 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9926e4c9-979f-42d2-a480-34e0d7b96299-dbus\") pod \"global-pull-secret-syncer-jpw95\" (UID: \"9926e4c9-979f-42d2-a480-34e0d7b96299\") " pod="kube-system/global-pull-secret-syncer-jpw95" Apr 22 17:53:28.117890 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:28.117881 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9926e4c9-979f-42d2-a480-34e0d7b96299-original-pull-secret\") pod \"global-pull-secret-syncer-jpw95\" (UID: \"9926e4c9-979f-42d2-a480-34e0d7b96299\") " pod="kube-system/global-pull-secret-syncer-jpw95" Apr 22 17:53:28.219162 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:28.219126 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9926e4c9-979f-42d2-a480-34e0d7b96299-kubelet-config\") pod \"global-pull-secret-syncer-jpw95\" (UID: \"9926e4c9-979f-42d2-a480-34e0d7b96299\") " pod="kube-system/global-pull-secret-syncer-jpw95" Apr 22 17:53:28.219291 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:28.219192 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9926e4c9-979f-42d2-a480-34e0d7b96299-dbus\") pod \"global-pull-secret-syncer-jpw95\" (UID: \"9926e4c9-979f-42d2-a480-34e0d7b96299\") " pod="kube-system/global-pull-secret-syncer-jpw95" Apr 22 17:53:28.219291 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:28.219229 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9926e4c9-979f-42d2-a480-34e0d7b96299-original-pull-secret\") pod \"global-pull-secret-syncer-jpw95\" (UID: \"9926e4c9-979f-42d2-a480-34e0d7b96299\") " pod="kube-system/global-pull-secret-syncer-jpw95" Apr 22 17:53:28.219291 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:28.219248 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9926e4c9-979f-42d2-a480-34e0d7b96299-kubelet-config\") pod \"global-pull-secret-syncer-jpw95\" (UID: \"9926e4c9-979f-42d2-a480-34e0d7b96299\") " pod="kube-system/global-pull-secret-syncer-jpw95" Apr 22 17:53:28.219435 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:28.219344 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:53:28.219435 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:28.219411 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9926e4c9-979f-42d2-a480-34e0d7b96299-original-pull-secret podName:9926e4c9-979f-42d2-a480-34e0d7b96299 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:53:28.719394135 +0000 UTC m=+23.406854532 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9926e4c9-979f-42d2-a480-34e0d7b96299-original-pull-secret") pod "global-pull-secret-syncer-jpw95" (UID: "9926e4c9-979f-42d2-a480-34e0d7b96299") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:53:28.219528 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:28.219510 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9926e4c9-979f-42d2-a480-34e0d7b96299-dbus\") pod \"global-pull-secret-syncer-jpw95\" (UID: \"9926e4c9-979f-42d2-a480-34e0d7b96299\") " pod="kube-system/global-pull-secret-syncer-jpw95" Apr 22 17:53:28.723484 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:28.723449 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9926e4c9-979f-42d2-a480-34e0d7b96299-original-pull-secret\") pod \"global-pull-secret-syncer-jpw95\" (UID: \"9926e4c9-979f-42d2-a480-34e0d7b96299\") " pod="kube-system/global-pull-secret-syncer-jpw95" Apr 22 17:53:28.723638 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:28.723616 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:53:28.723689 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:28.723681 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9926e4c9-979f-42d2-a480-34e0d7b96299-original-pull-secret podName:9926e4c9-979f-42d2-a480-34e0d7b96299 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:29.723666391 +0000 UTC m=+24.411126775 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9926e4c9-979f-42d2-a480-34e0d7b96299-original-pull-secret") pod "global-pull-secret-syncer-jpw95" (UID: "9926e4c9-979f-42d2-a480-34e0d7b96299") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:53:29.732342 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:29.732316 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9926e4c9-979f-42d2-a480-34e0d7b96299-original-pull-secret\") pod \"global-pull-secret-syncer-jpw95\" (UID: \"9926e4c9-979f-42d2-a480-34e0d7b96299\") " pod="kube-system/global-pull-secret-syncer-jpw95" Apr 22 17:53:29.732880 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:29.732432 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:53:29.732880 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:29.732479 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9926e4c9-979f-42d2-a480-34e0d7b96299-original-pull-secret podName:9926e4c9-979f-42d2-a480-34e0d7b96299 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:31.732466801 +0000 UTC m=+26.419927186 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9926e4c9-979f-42d2-a480-34e0d7b96299-original-pull-secret") pod "global-pull-secret-syncer-jpw95" (UID: "9926e4c9-979f-42d2-a480-34e0d7b96299") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:53:29.851074 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:29.851042 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:53:29.851191 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:29.851042 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-jpw95" Apr 22 17:53:29.851191 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:29.851139 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fztfm" podUID="027c3a56-b141-4f0e-beda-4bbc2fdc45c6" Apr 22 17:53:29.851291 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:29.851210 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jpw95" podUID="9926e4c9-979f-42d2-a480-34e0d7b96299" Apr 22 17:53:29.851291 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:29.851042 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4jvnk" Apr 22 17:53:29.851400 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:29.851289 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4jvnk" podUID="eb07fcd6-cc65-437c-9bc0-d210593e3edf" Apr 22 17:53:29.999191 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:29.999139 2574 generic.go:358] "Generic (PLEG): container finished" podID="f0fd569b-4e3e-4771-8dec-d6f16a52e2b9" containerID="92712381eb548eb71a9db2fa06af606894f832258db3d1ec0587240af45ad4a1" exitCode=0 Apr 22 17:53:29.999281 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:29.999217 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-972f2" event={"ID":"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9","Type":"ContainerDied","Data":"92712381eb548eb71a9db2fa06af606894f832258db3d1ec0587240af45ad4a1"} Apr 22 17:53:30.002063 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:30.002041 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 17:53:30.004505 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:30.004481 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" event={"ID":"4abe7788-23bd-436c-bc7c-1de96634aa32","Type":"ContainerStarted","Data":"3ccf33e41a21586099f08ac6e872e6f6cd7b20dad845cb540e749c323696992e"} Apr 22 17:53:31.008013 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:31.007988 2574 generic.go:358] "Generic (PLEG): container finished" podID="f0fd569b-4e3e-4771-8dec-d6f16a52e2b9" containerID="6f004301fe1c3f10dca8d10757f17367835caf459e912e1a488743f1e80cdc4c" exitCode=0 Apr 22 17:53:31.008297 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:31.008044 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-972f2" event={"ID":"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9","Type":"ContainerDied","Data":"6f004301fe1c3f10dca8d10757f17367835caf459e912e1a488743f1e80cdc4c"} Apr 22 17:53:31.748493 ip-10-0-132-24 kubenswrapper[2574]: I0422 
17:53:31.748329 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9926e4c9-979f-42d2-a480-34e0d7b96299-original-pull-secret\") pod \"global-pull-secret-syncer-jpw95\" (UID: \"9926e4c9-979f-42d2-a480-34e0d7b96299\") " pod="kube-system/global-pull-secret-syncer-jpw95" Apr 22 17:53:31.748573 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:31.748459 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:53:31.748573 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:31.748563 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9926e4c9-979f-42d2-a480-34e0d7b96299-original-pull-secret podName:9926e4c9-979f-42d2-a480-34e0d7b96299 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:35.748548906 +0000 UTC m=+30.436009290 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9926e4c9-979f-42d2-a480-34e0d7b96299-original-pull-secret") pod "global-pull-secret-syncer-jpw95" (UID: "9926e4c9-979f-42d2-a480-34e0d7b96299") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:53:31.851431 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:31.851409 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4jvnk" Apr 22 17:53:31.851431 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:31.851424 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:53:31.851546 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:31.851410 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-jpw95" Apr 22 17:53:31.851546 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:31.851485 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4jvnk" podUID="eb07fcd6-cc65-437c-9bc0-d210593e3edf" Apr 22 17:53:31.851616 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:31.851573 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jpw95" podUID="9926e4c9-979f-42d2-a480-34e0d7b96299" Apr 22 17:53:31.851646 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:31.851630 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fztfm" podUID="027c3a56-b141-4f0e-beda-4bbc2fdc45c6" Apr 22 17:53:32.012318 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:32.012300 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 17:53:32.012621 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:32.012602 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" event={"ID":"4abe7788-23bd-436c-bc7c-1de96634aa32","Type":"ContainerStarted","Data":"d2f10e7a1092185e96ce1fe0d2702bf878ee33e453216fdb8f400bbefed4e084"} Apr 22 17:53:32.012923 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:32.012905 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:32.013009 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:32.012930 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:32.013009 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:32.012943 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:32.013083 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:32.013038 2574 scope.go:117] "RemoveContainer" containerID="f583d524ab3e8a68d1786adb5a97053811d3faf31cb0844b4ab37f120e519188" Apr 22 17:53:32.014894 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:32.014871 2574 generic.go:358] "Generic (PLEG): container finished" podID="f0fd569b-4e3e-4771-8dec-d6f16a52e2b9" containerID="2592223cff93c27685d47b8ec19017687825ffff8c3c06c98ebd4672e44caf23" exitCode=0 Apr 22 17:53:32.014970 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:32.014913 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-972f2" 
event={"ID":"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9","Type":"ContainerDied","Data":"2592223cff93c27685d47b8ec19017687825ffff8c3c06c98ebd4672e44caf23"} Apr 22 17:53:32.027448 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:32.027431 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:32.027592 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:32.027576 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:53:33.021720 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:33.021690 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 17:53:33.022337 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:33.022274 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" event={"ID":"4abe7788-23bd-436c-bc7c-1de96634aa32","Type":"ContainerStarted","Data":"205a364dfbc396f4db1a93f605607e9bc7974856cb7669603cae69d88136dbac"} Apr 22 17:53:33.052953 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:33.052246 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" podStartSLOduration=10.682842079 podStartE2EDuration="27.05222758s" podCreationTimestamp="2026-04-22 17:53:06 +0000 UTC" firstStartedPulling="2026-04-22 17:53:08.412256276 +0000 UTC m=+3.099716664" lastFinishedPulling="2026-04-22 17:53:24.781641778 +0000 UTC m=+19.469102165" observedRunningTime="2026-04-22 17:53:33.051031671 +0000 UTC m=+27.738492110" watchObservedRunningTime="2026-04-22 17:53:33.05222758 +0000 UTC m=+27.739687978" Apr 22 17:53:33.262702 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:33.262671 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4jvnk"] Apr 22 
17:53:33.262883 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:33.262794 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4jvnk" Apr 22 17:53:33.263084 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:33.262924 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4jvnk" podUID="eb07fcd6-cc65-437c-9bc0-d210593e3edf" Apr 22 17:53:33.263084 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:33.263041 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fztfm"] Apr 22 17:53:33.263209 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:33.263153 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:53:33.263274 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:33.263246 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fztfm" podUID="027c3a56-b141-4f0e-beda-4bbc2fdc45c6" Apr 22 17:53:33.272681 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:33.272237 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jpw95"] Apr 22 17:53:33.272681 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:33.272335 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-jpw95" Apr 22 17:53:33.272681 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:33.272406 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jpw95" podUID="9926e4c9-979f-42d2-a480-34e0d7b96299" Apr 22 17:53:34.851251 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:34.851052 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:53:34.851673 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:34.851052 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jpw95" Apr 22 17:53:34.851673 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:34.851359 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fztfm" podUID="027c3a56-b141-4f0e-beda-4bbc2fdc45c6" Apr 22 17:53:34.851673 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:34.851082 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4jvnk" Apr 22 17:53:34.851673 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:34.851446 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jpw95" podUID="9926e4c9-979f-42d2-a480-34e0d7b96299" Apr 22 17:53:34.851673 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:34.851491 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4jvnk" podUID="eb07fcd6-cc65-437c-9bc0-d210593e3edf" Apr 22 17:53:35.782644 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:35.782611 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9926e4c9-979f-42d2-a480-34e0d7b96299-original-pull-secret\") pod \"global-pull-secret-syncer-jpw95\" (UID: \"9926e4c9-979f-42d2-a480-34e0d7b96299\") " pod="kube-system/global-pull-secret-syncer-jpw95" Apr 22 17:53:35.782847 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:35.782809 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:53:35.782923 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:35.782912 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9926e4c9-979f-42d2-a480-34e0d7b96299-original-pull-secret podName:9926e4c9-979f-42d2-a480-34e0d7b96299 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:53:43.782888686 +0000 UTC m=+38.470349083 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9926e4c9-979f-42d2-a480-34e0d7b96299-original-pull-secret") pod "global-pull-secret-syncer-jpw95" (UID: "9926e4c9-979f-42d2-a480-34e0d7b96299") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:53:36.674082 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:36.674044 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-lmpbm" Apr 22 17:53:36.674635 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:36.674201 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 17:53:36.674704 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:36.674647 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-lmpbm" Apr 22 17:53:36.852057 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:36.852031 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jpw95" Apr 22 17:53:36.852057 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:36.852049 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:53:36.852286 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:36.852031 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4jvnk" Apr 22 17:53:36.852286 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:36.852133 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-jpw95" podUID="9926e4c9-979f-42d2-a480-34e0d7b96299" Apr 22 17:53:36.852286 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:36.852206 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fztfm" podUID="027c3a56-b141-4f0e-beda-4bbc2fdc45c6" Apr 22 17:53:36.852452 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:36.852306 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4jvnk" podUID="eb07fcd6-cc65-437c-9bc0-d210593e3edf" Apr 22 17:53:37.149213 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.149187 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-24.ec2.internal" event="NodeReady" Apr 22 17:53:37.149375 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.149326 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 17:53:37.188497 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.188470 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5z57j"] Apr 22 17:53:37.223887 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.223850 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-85lcr"] Apr 22 17:53:37.224010 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.223906 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-5z57j"
Apr 22 17:53:37.226265 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.226196 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 17:53:37.226362 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.226330 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 17:53:37.226443 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.226424 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-r5nxd\""
Apr 22 17:53:37.239176 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.239155 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5z57j"]
Apr 22 17:53:37.239277 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.239186 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-85lcr"]
Apr 22 17:53:37.241894 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.239396 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-85lcr"
Apr 22 17:53:37.243438 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.243260 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 17:53:37.243438 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.243296 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 17:53:37.243438 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.243435 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-frn4n\""
Apr 22 17:53:37.243620 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.243521 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 17:53:37.394197 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.394105 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18c1f8e2-48fe-42d1-bfbc-436a196841e4-config-volume\") pod \"dns-default-5z57j\" (UID: \"18c1f8e2-48fe-42d1-bfbc-436a196841e4\") " pod="openshift-dns/dns-default-5z57j"
Apr 22 17:53:37.394197 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.394182 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls\") pod \"dns-default-5z57j\" (UID: \"18c1f8e2-48fe-42d1-bfbc-436a196841e4\") " pod="openshift-dns/dns-default-5z57j"
Apr 22 17:53:37.394411 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.394222 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f65xr\" (UniqueName:
\"kubernetes.io/projected/0a435446-c735-4a7b-bbb9-eab6af3f7b77-kube-api-access-f65xr\") pod \"ingress-canary-85lcr\" (UID: \"0a435446-c735-4a7b-bbb9-eab6af3f7b77\") " pod="openshift-ingress-canary/ingress-canary-85lcr"
Apr 22 17:53:37.394411 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.394264 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert\") pod \"ingress-canary-85lcr\" (UID: \"0a435446-c735-4a7b-bbb9-eab6af3f7b77\") " pod="openshift-ingress-canary/ingress-canary-85lcr"
Apr 22 17:53:37.394411 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.394315 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/18c1f8e2-48fe-42d1-bfbc-436a196841e4-tmp-dir\") pod \"dns-default-5z57j\" (UID: \"18c1f8e2-48fe-42d1-bfbc-436a196841e4\") " pod="openshift-dns/dns-default-5z57j"
Apr 22 17:53:37.394411 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.394335 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsqff\" (UniqueName: \"kubernetes.io/projected/18c1f8e2-48fe-42d1-bfbc-436a196841e4-kube-api-access-hsqff\") pod \"dns-default-5z57j\" (UID: \"18c1f8e2-48fe-42d1-bfbc-436a196841e4\") " pod="openshift-dns/dns-default-5z57j"
Apr 22 17:53:37.495004 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.494969 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls\") pod \"dns-default-5z57j\" (UID: \"18c1f8e2-48fe-42d1-bfbc-436a196841e4\") " pod="openshift-dns/dns-default-5z57j"
Apr 22 17:53:37.495177 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.495022 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"kube-api-access-f65xr\" (UniqueName: \"kubernetes.io/projected/0a435446-c735-4a7b-bbb9-eab6af3f7b77-kube-api-access-f65xr\") pod \"ingress-canary-85lcr\" (UID: \"0a435446-c735-4a7b-bbb9-eab6af3f7b77\") " pod="openshift-ingress-canary/ingress-canary-85lcr"
Apr 22 17:53:37.495177 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.495048 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert\") pod \"ingress-canary-85lcr\" (UID: \"0a435446-c735-4a7b-bbb9-eab6af3f7b77\") " pod="openshift-ingress-canary/ingress-canary-85lcr"
Apr 22 17:53:37.495177 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:37.495048 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:53:37.495177 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.495091 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/18c1f8e2-48fe-42d1-bfbc-436a196841e4-tmp-dir\") pod \"dns-default-5z57j\" (UID: \"18c1f8e2-48fe-42d1-bfbc-436a196841e4\") " pod="openshift-dns/dns-default-5z57j"
Apr 22 17:53:37.495177 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:37.495117 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls podName:18c1f8e2-48fe-42d1-bfbc-436a196841e4 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:37.995096082 +0000 UTC m=+32.682556469 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls") pod "dns-default-5z57j" (UID: "18c1f8e2-48fe-42d1-bfbc-436a196841e4") : secret "dns-default-metrics-tls" not found
Apr 22 17:53:37.495177 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.495159 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsqff\" (UniqueName: \"kubernetes.io/projected/18c1f8e2-48fe-42d1-bfbc-436a196841e4-kube-api-access-hsqff\") pod \"dns-default-5z57j\" (UID: \"18c1f8e2-48fe-42d1-bfbc-436a196841e4\") " pod="openshift-dns/dns-default-5z57j"
Apr 22 17:53:37.495177 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:37.495169 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:53:37.495515 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.495218 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18c1f8e2-48fe-42d1-bfbc-436a196841e4-config-volume\") pod \"dns-default-5z57j\" (UID: \"18c1f8e2-48fe-42d1-bfbc-436a196841e4\") " pod="openshift-dns/dns-default-5z57j"
Apr 22 17:53:37.495515 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:37.495236 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert podName:0a435446-c735-4a7b-bbb9-eab6af3f7b77 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:37.995220156 +0000 UTC m=+32.682680560 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert") pod "ingress-canary-85lcr" (UID: "0a435446-c735-4a7b-bbb9-eab6af3f7b77") : secret "canary-serving-cert" not found
Apr 22 17:53:37.495515 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.495316 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/18c1f8e2-48fe-42d1-bfbc-436a196841e4-tmp-dir\") pod \"dns-default-5z57j\" (UID: \"18c1f8e2-48fe-42d1-bfbc-436a196841e4\") " pod="openshift-dns/dns-default-5z57j"
Apr 22 17:53:37.495751 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.495729 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18c1f8e2-48fe-42d1-bfbc-436a196841e4-config-volume\") pod \"dns-default-5z57j\" (UID: \"18c1f8e2-48fe-42d1-bfbc-436a196841e4\") " pod="openshift-dns/dns-default-5z57j"
Apr 22 17:53:37.505224 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.505201 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsqff\" (UniqueName: \"kubernetes.io/projected/18c1f8e2-48fe-42d1-bfbc-436a196841e4-kube-api-access-hsqff\") pod \"dns-default-5z57j\" (UID: \"18c1f8e2-48fe-42d1-bfbc-436a196841e4\") " pod="openshift-dns/dns-default-5z57j"
Apr 22 17:53:37.505325 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.505249 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f65xr\" (UniqueName: \"kubernetes.io/projected/0a435446-c735-4a7b-bbb9-eab6af3f7b77-kube-api-access-f65xr\") pod \"ingress-canary-85lcr\" (UID: \"0a435446-c735-4a7b-bbb9-eab6af3f7b77\") " pod="openshift-ingress-canary/ingress-canary-85lcr"
Apr 22 17:53:37.998234 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.998211 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName:
\"kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls\") pod \"dns-default-5z57j\" (UID: \"18c1f8e2-48fe-42d1-bfbc-436a196841e4\") " pod="openshift-dns/dns-default-5z57j"
Apr 22 17:53:37.998727 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:37.998251 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert\") pod \"ingress-canary-85lcr\" (UID: \"0a435446-c735-4a7b-bbb9-eab6af3f7b77\") " pod="openshift-ingress-canary/ingress-canary-85lcr"
Apr 22 17:53:37.998727 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:37.998347 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:53:37.998727 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:37.998361 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:53:37.998727 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:37.998399 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls podName:18c1f8e2-48fe-42d1-bfbc-436a196841e4 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:38.998385091 +0000 UTC m=+33.685845475 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls") pod "dns-default-5z57j" (UID: "18c1f8e2-48fe-42d1-bfbc-436a196841e4") : secret "dns-default-metrics-tls" not found
Apr 22 17:53:37.998727 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:37.998415 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert podName:0a435446-c735-4a7b-bbb9-eab6af3f7b77 nodeName:}" failed.
No retries permitted until 2026-04-22 17:53:38.998408976 +0000 UTC m=+33.685869360 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert") pod "ingress-canary-85lcr" (UID: "0a435446-c735-4a7b-bbb9-eab6af3f7b77") : secret "canary-serving-cert" not found
Apr 22 17:53:38.034010 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:38.033972 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-972f2" event={"ID":"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9","Type":"ContainerStarted","Data":"837569c84d731f1ae7ce6f8e59b0976b23263f6dcc4c0a9100c35d33cb73fb34"}
Apr 22 17:53:38.851817 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:38.851786 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fztfm"
Apr 22 17:53:38.851817 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:38.851824 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4jvnk"
Apr 22 17:53:38.852105 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:38.851846 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-jpw95"
Apr 22 17:53:38.854749 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:38.854729 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 17:53:38.854861 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:38.854730 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 17:53:38.856138 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:38.856123 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 17:53:38.856239 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:38.856206 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qsx25\""
Apr 22 17:53:38.856304 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:38.856293 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jtfq9\""
Apr 22 17:53:38.856356 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:38.856329 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 17:53:39.007814 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:39.007794 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls\") pod \"dns-default-5z57j\" (UID: \"18c1f8e2-48fe-42d1-bfbc-436a196841e4\") " pod="openshift-dns/dns-default-5z57j"
Apr 22 17:53:39.008134 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:39.007823 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert\")
pod \"ingress-canary-85lcr\" (UID: \"0a435446-c735-4a7b-bbb9-eab6af3f7b77\") " pod="openshift-ingress-canary/ingress-canary-85lcr"
Apr 22 17:53:39.008134 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:39.007956 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:53:39.008134 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:39.008031 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:53:39.008134 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:39.008089 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls podName:18c1f8e2-48fe-42d1-bfbc-436a196841e4 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:41.008056684 +0000 UTC m=+35.695517068 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls") pod "dns-default-5z57j" (UID: "18c1f8e2-48fe-42d1-bfbc-436a196841e4") : secret "dns-default-metrics-tls" not found
Apr 22 17:53:39.008134 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:39.008107 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert podName:0a435446-c735-4a7b-bbb9-eab6af3f7b77 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:41.008101173 +0000 UTC m=+35.695561557 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert") pod "ingress-canary-85lcr" (UID: "0a435446-c735-4a7b-bbb9-eab6af3f7b77") : secret "canary-serving-cert" not found
Apr 22 17:53:39.038339 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:39.038311 2574 generic.go:358] "Generic (PLEG): container finished" podID="f0fd569b-4e3e-4771-8dec-d6f16a52e2b9" containerID="837569c84d731f1ae7ce6f8e59b0976b23263f6dcc4c0a9100c35d33cb73fb34" exitCode=0
Apr 22 17:53:39.038419 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:39.038350 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-972f2" event={"ID":"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9","Type":"ContainerDied","Data":"837569c84d731f1ae7ce6f8e59b0976b23263f6dcc4c0a9100c35d33cb73fb34"}
Apr 22 17:53:39.511695 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:39.511668 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs\") pod \"network-metrics-daemon-fztfm\" (UID: \"027c3a56-b141-4f0e-beda-4bbc2fdc45c6\") " pod="openshift-multus/network-metrics-daemon-fztfm"
Apr 22 17:53:39.511811 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:39.511790 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 17:53:39.511870 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:39.511860 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs podName:027c3a56-b141-4f0e-beda-4bbc2fdc45c6 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:11.511827939 +0000 UTC m=+66.199288325 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs") pod "network-metrics-daemon-fztfm" (UID: "027c3a56-b141-4f0e-beda-4bbc2fdc45c6") : secret "metrics-daemon-secret" not found
Apr 22 17:53:39.715954 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:39.713517 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qd778\" (UniqueName: \"kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778\") pod \"network-check-target-4jvnk\" (UID: \"eb07fcd6-cc65-437c-9bc0-d210593e3edf\") " pod="openshift-network-diagnostics/network-check-target-4jvnk"
Apr 22 17:53:39.716882 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:39.716862 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd778\" (UniqueName: \"kubernetes.io/projected/eb07fcd6-cc65-437c-9bc0-d210593e3edf-kube-api-access-qd778\") pod \"network-check-target-4jvnk\" (UID: \"eb07fcd6-cc65-437c-9bc0-d210593e3edf\") " pod="openshift-network-diagnostics/network-check-target-4jvnk"
Apr 22 17:53:39.761423 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:39.761393 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4jvnk"
Apr 22 17:53:39.946565 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:39.946407 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4jvnk"]
Apr 22 17:53:39.950682 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:39.950649 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb07fcd6_cc65_437c_9bc0_d210593e3edf.slice/crio-beb57341bc06af374a409f2496ad915b060e0e48b6b056d68656091f5701e179 WatchSource:0}: Error finding container beb57341bc06af374a409f2496ad915b060e0e48b6b056d68656091f5701e179: Status 404 returned error can't find the container with id beb57341bc06af374a409f2496ad915b060e0e48b6b056d68656091f5701e179
Apr 22 17:53:40.042536 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:40.042457 2574 generic.go:358] "Generic (PLEG): container finished" podID="f0fd569b-4e3e-4771-8dec-d6f16a52e2b9" containerID="73a1f070a3b6ac384d68f923c642246d3f04a0c0ea62da7efc2a271e0fcdee3b" exitCode=0
Apr 22 17:53:40.042905 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:40.042536 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-972f2" event={"ID":"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9","Type":"ContainerDied","Data":"73a1f070a3b6ac384d68f923c642246d3f04a0c0ea62da7efc2a271e0fcdee3b"}
Apr 22 17:53:40.043563 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:40.043541 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4jvnk" event={"ID":"eb07fcd6-cc65-437c-9bc0-d210593e3edf","Type":"ContainerStarted","Data":"beb57341bc06af374a409f2496ad915b060e0e48b6b056d68656091f5701e179"}
Apr 22 17:53:41.024552 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:41.024517 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\"
(UniqueName: \"kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls\") pod \"dns-default-5z57j\" (UID: \"18c1f8e2-48fe-42d1-bfbc-436a196841e4\") " pod="openshift-dns/dns-default-5z57j"
Apr 22 17:53:41.024552 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:41.024561 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert\") pod \"ingress-canary-85lcr\" (UID: \"0a435446-c735-4a7b-bbb9-eab6af3f7b77\") " pod="openshift-ingress-canary/ingress-canary-85lcr"
Apr 22 17:53:41.024776 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:41.024671 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:53:41.024776 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:41.024734 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:53:41.024776 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:41.024739 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls podName:18c1f8e2-48fe-42d1-bfbc-436a196841e4 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:45.024720266 +0000 UTC m=+39.712180670 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls") pod "dns-default-5z57j" (UID: "18c1f8e2-48fe-42d1-bfbc-436a196841e4") : secret "dns-default-metrics-tls" not found
Apr 22 17:53:41.024952 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:41.024790 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert podName:0a435446-c735-4a7b-bbb9-eab6af3f7b77 nodeName:}" failed.
No retries permitted until 2026-04-22 17:53:45.024762261 +0000 UTC m=+39.712222647 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert") pod "ingress-canary-85lcr" (UID: "0a435446-c735-4a7b-bbb9-eab6af3f7b77") : secret "canary-serving-cert" not found
Apr 22 17:53:41.048247 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:41.048223 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-972f2" event={"ID":"f0fd569b-4e3e-4771-8dec-d6f16a52e2b9","Type":"ContainerStarted","Data":"208c2f0f7e0d55d34ae3440ff52d37905fc8285f7b39f442b26eaaf2c53ac3b6"}
Apr 22 17:53:41.075555 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:41.075513 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-972f2" podStartSLOduration=6.638263832 podStartE2EDuration="36.07550027s" podCreationTimestamp="2026-04-22 17:53:05 +0000 UTC" firstStartedPulling="2026-04-22 17:53:08.409679417 +0000 UTC m=+3.097139801" lastFinishedPulling="2026-04-22 17:53:37.846915851 +0000 UTC m=+32.534376239" observedRunningTime="2026-04-22 17:53:41.074143958 +0000 UTC m=+35.761604364" watchObservedRunningTime="2026-04-22 17:53:41.07550027 +0000 UTC m=+35.762960675"
Apr 22 17:53:43.848595 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:43.848558 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9926e4c9-979f-42d2-a480-34e0d7b96299-original-pull-secret\") pod \"global-pull-secret-syncer-jpw95\" (UID: \"9926e4c9-979f-42d2-a480-34e0d7b96299\") " pod="kube-system/global-pull-secret-syncer-jpw95"
Apr 22 17:53:43.854372 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:43.854342 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName:
\"kubernetes.io/secret/9926e4c9-979f-42d2-a480-34e0d7b96299-original-pull-secret\") pod \"global-pull-secret-syncer-jpw95\" (UID: \"9926e4c9-979f-42d2-a480-34e0d7b96299\") " pod="kube-system/global-pull-secret-syncer-jpw95"
Apr 22 17:53:43.971378 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:43.971349 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jpw95"
Apr 22 17:53:44.056335 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:44.056283 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4jvnk" event={"ID":"eb07fcd6-cc65-437c-9bc0-d210593e3edf","Type":"ContainerStarted","Data":"d608529a8a64d3ad391301baa6ff35b9f76a526d2c00bdbd3fd55a61d054d1df"}
Apr 22 17:53:44.056465 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:44.056425 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-4jvnk"
Apr 22 17:53:44.074207 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:44.074171 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-4jvnk" podStartSLOduration=34.740652768 podStartE2EDuration="38.074159927s" podCreationTimestamp="2026-04-22 17:53:06 +0000 UTC" firstStartedPulling="2026-04-22 17:53:39.952451678 +0000 UTC m=+34.639912062" lastFinishedPulling="2026-04-22 17:53:43.285958834 +0000 UTC m=+37.973419221" observedRunningTime="2026-04-22 17:53:44.073002318 +0000 UTC m=+38.760462728" watchObservedRunningTime="2026-04-22 17:53:44.074159927 +0000 UTC m=+38.761620333"
Apr 22 17:53:44.090229 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:44.090202 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jpw95"]
Apr 22 17:53:44.093727 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:53:44.093702 2574 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9926e4c9_979f_42d2_a480_34e0d7b96299.slice/crio-24364e4d759559c22fc44bb16003f14c20e35ec96200919b9d25dc15b655c64e WatchSource:0}: Error finding container 24364e4d759559c22fc44bb16003f14c20e35ec96200919b9d25dc15b655c64e: Status 404 returned error can't find the container with id 24364e4d759559c22fc44bb16003f14c20e35ec96200919b9d25dc15b655c64e
Apr 22 17:53:45.057110 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:45.057067 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls\") pod \"dns-default-5z57j\" (UID: \"18c1f8e2-48fe-42d1-bfbc-436a196841e4\") " pod="openshift-dns/dns-default-5z57j"
Apr 22 17:53:45.057575 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:45.057122 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert\") pod \"ingress-canary-85lcr\" (UID: \"0a435446-c735-4a7b-bbb9-eab6af3f7b77\") " pod="openshift-ingress-canary/ingress-canary-85lcr"
Apr 22 17:53:45.057575 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:45.057228 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:53:45.057575 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:45.057281 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert podName:0a435446-c735-4a7b-bbb9-eab6af3f7b77 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:53.057266341 +0000 UTC m=+47.744726729 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert") pod "ingress-canary-85lcr" (UID: "0a435446-c735-4a7b-bbb9-eab6af3f7b77") : secret "canary-serving-cert" not found
Apr 22 17:53:45.057575 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:45.057228 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:53:45.057575 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:45.057345 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls podName:18c1f8e2-48fe-42d1-bfbc-436a196841e4 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:53.057333506 +0000 UTC m=+47.744793891 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls") pod "dns-default-5z57j" (UID: "18c1f8e2-48fe-42d1-bfbc-436a196841e4") : secret "dns-default-metrics-tls" not found
Apr 22 17:53:45.059643 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:45.059611 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jpw95" event={"ID":"9926e4c9-979f-42d2-a480-34e0d7b96299","Type":"ContainerStarted","Data":"24364e4d759559c22fc44bb16003f14c20e35ec96200919b9d25dc15b655c64e"}
Apr 22 17:53:48.066536 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:48.066492 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jpw95" event={"ID":"9926e4c9-979f-42d2-a480-34e0d7b96299","Type":"ContainerStarted","Data":"c20ef6de3f04392ec7e31c8725bd0a3c4c84a36410bb92acc9d2036da96f362d"}
Apr 22 17:53:48.083055 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:48.083007 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-jpw95"
podStartSLOduration=17.493102842 podStartE2EDuration="21.082993875s" podCreationTimestamp="2026-04-22 17:53:27 +0000 UTC" firstStartedPulling="2026-04-22 17:53:44.095271675 +0000 UTC m=+38.782732060" lastFinishedPulling="2026-04-22 17:53:47.685162706 +0000 UTC m=+42.372623093" observedRunningTime="2026-04-22 17:53:48.082199447 +0000 UTC m=+42.769659853" watchObservedRunningTime="2026-04-22 17:53:48.082993875 +0000 UTC m=+42.770454311"
Apr 22 17:53:53.109212 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:53.109173 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls\") pod \"dns-default-5z57j\" (UID: \"18c1f8e2-48fe-42d1-bfbc-436a196841e4\") " pod="openshift-dns/dns-default-5z57j"
Apr 22 17:53:53.109212 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:53:53.109215 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert\") pod \"ingress-canary-85lcr\" (UID: \"0a435446-c735-4a7b-bbb9-eab6af3f7b77\") " pod="openshift-ingress-canary/ingress-canary-85lcr"
Apr 22 17:53:53.109674 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:53.109320 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:53:53.109674 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:53.109323 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:53:53.109674 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:53.109382 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls podName:18c1f8e2-48fe-42d1-bfbc-436a196841e4 nodeName:}" failed.
No retries permitted until 2026-04-22 17:54:09.109365914 +0000 UTC m=+63.796826300 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls") pod "dns-default-5z57j" (UID: "18c1f8e2-48fe-42d1-bfbc-436a196841e4") : secret "dns-default-metrics-tls" not found Apr 22 17:53:53.109674 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:53:53.109395 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert podName:0a435446-c735-4a7b-bbb9-eab6af3f7b77 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:09.109388933 +0000 UTC m=+63.796849318 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert") pod "ingress-canary-85lcr" (UID: "0a435446-c735-4a7b-bbb9-eab6af3f7b77") : secret "canary-serving-cert" not found Apr 22 17:54:04.034841 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:54:04.034811 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zrmhz" Apr 22 17:54:09.117321 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:54:09.117281 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls\") pod \"dns-default-5z57j\" (UID: \"18c1f8e2-48fe-42d1-bfbc-436a196841e4\") " pod="openshift-dns/dns-default-5z57j" Apr 22 17:54:09.117321 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:54:09.117324 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert\") pod \"ingress-canary-85lcr\" (UID: \"0a435446-c735-4a7b-bbb9-eab6af3f7b77\") " pod="openshift-ingress-canary/ingress-canary-85lcr" Apr 22 17:54:09.117768 ip-10-0-132-24 
kubenswrapper[2574]: E0422 17:54:09.117429 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:09.117768 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:54:09.117434 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:09.117768 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:54:09.117482 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert podName:0a435446-c735-4a7b-bbb9-eab6af3f7b77 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:41.117465733 +0000 UTC m=+95.804926118 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert") pod "ingress-canary-85lcr" (UID: "0a435446-c735-4a7b-bbb9-eab6af3f7b77") : secret "canary-serving-cert" not found Apr 22 17:54:09.117768 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:54:09.117504 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls podName:18c1f8e2-48fe-42d1-bfbc-436a196841e4 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:41.117488234 +0000 UTC m=+95.804948619 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls") pod "dns-default-5z57j" (UID: "18c1f8e2-48fe-42d1-bfbc-436a196841e4") : secret "dns-default-metrics-tls" not found Apr 22 17:54:11.530425 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:54:11.530383 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs\") pod \"network-metrics-daemon-fztfm\" (UID: \"027c3a56-b141-4f0e-beda-4bbc2fdc45c6\") " pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:54:11.530797 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:54:11.530529 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 17:54:11.530797 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:54:11.530596 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs podName:027c3a56-b141-4f0e-beda-4bbc2fdc45c6 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:15.530579956 +0000 UTC m=+130.218040340 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs") pod "network-metrics-daemon-fztfm" (UID: "027c3a56-b141-4f0e-beda-4bbc2fdc45c6") : secret "metrics-daemon-secret" not found Apr 22 17:54:15.062297 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:54:15.062193 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-4jvnk" Apr 22 17:54:41.127591 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:54:41.127555 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls\") pod \"dns-default-5z57j\" (UID: \"18c1f8e2-48fe-42d1-bfbc-436a196841e4\") " pod="openshift-dns/dns-default-5z57j" Apr 22 17:54:41.127591 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:54:41.127595 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert\") pod \"ingress-canary-85lcr\" (UID: \"0a435446-c735-4a7b-bbb9-eab6af3f7b77\") " pod="openshift-ingress-canary/ingress-canary-85lcr" Apr 22 17:54:41.128102 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:54:41.127704 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:41.128102 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:54:41.127705 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:41.128102 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:54:41.127761 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert podName:0a435446-c735-4a7b-bbb9-eab6af3f7b77 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:55:45.127746134 +0000 UTC m=+159.815206519 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert") pod "ingress-canary-85lcr" (UID: "0a435446-c735-4a7b-bbb9-eab6af3f7b77") : secret "canary-serving-cert" not found Apr 22 17:54:41.128102 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:54:41.127773 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls podName:18c1f8e2-48fe-42d1-bfbc-436a196841e4 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:45.127767511 +0000 UTC m=+159.815227895 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls") pod "dns-default-5z57j" (UID: "18c1f8e2-48fe-42d1-bfbc-436a196841e4") : secret "dns-default-metrics-tls" not found Apr 22 17:55:15.546290 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:15.546239 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs\") pod \"network-metrics-daemon-fztfm\" (UID: \"027c3a56-b141-4f0e-beda-4bbc2fdc45c6\") " pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:55:15.546822 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:15.546374 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 17:55:15.546822 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:15.546800 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs podName:027c3a56-b141-4f0e-beda-4bbc2fdc45c6 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:57:17.546775593 +0000 UTC m=+252.234235993 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs") pod "network-metrics-daemon-fztfm" (UID: "027c3a56-b141-4f0e-beda-4bbc2fdc45c6") : secret "metrics-daemon-secret" not found Apr 22 17:55:33.644561 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.644520 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x4jsj"] Apr 22 17:55:33.647350 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.647330 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x4jsj" Apr 22 17:55:33.651069 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.651050 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 17:55:33.651306 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.651291 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:55:33.651400 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.651380 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 17:55:33.651443 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.651416 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 17:55:33.652779 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.652763 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-jz965\"" Apr 22 17:55:33.656129 ip-10-0-132-24 kubenswrapper[2574]: I0422 
17:55:33.656110 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x4jsj"] Apr 22 17:55:33.746502 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.746467 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-82zvn"] Apr 22 17:55:33.749138 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.749117 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6lkst"] Apr 22 17:55:33.749272 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.749254 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-82zvn" Apr 22 17:55:33.751634 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.751612 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5544d4d987-rql9c"] Apr 22 17:55:33.751754 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.751735 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6lkst" Apr 22 17:55:33.752234 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.752208 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 17:55:33.752385 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.752370 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 17:55:33.752515 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.752483 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-g6646\"" Apr 22 17:55:33.752582 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.752521 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:55:33.752582 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.752561 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 17:55:33.753936 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.753914 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-mg4ld\"" Apr 22 17:55:33.754242 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.754220 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 17:55:33.754361 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.754344 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.755010 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.754990 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:55:33.760798 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.760528 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 17:55:33.760798 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.760587 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-gd422\"" Apr 22 17:55:33.760798 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.760774 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 17:55:33.761774 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.761756 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 17:55:33.763156 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.763136 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6lkst"] Apr 22 17:55:33.765982 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.765962 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-82zvn"] Apr 22 17:55:33.766224 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.766176 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 17:55:33.769200 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.769182 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b85d7429-1a02-4051-b96d-692a3bc3bacc-serving-cert\") pod \"service-ca-operator-d6fc45fc5-x4jsj\" (UID: \"b85d7429-1a02-4051-b96d-692a3bc3bacc\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x4jsj" Apr 22 17:55:33.769262 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.769208 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djhzn\" (UniqueName: \"kubernetes.io/projected/b85d7429-1a02-4051-b96d-692a3bc3bacc-kube-api-access-djhzn\") pod \"service-ca-operator-d6fc45fc5-x4jsj\" (UID: \"b85d7429-1a02-4051-b96d-692a3bc3bacc\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x4jsj" Apr 22 17:55:33.769262 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.769231 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85d7429-1a02-4051-b96d-692a3bc3bacc-config\") pod \"service-ca-operator-d6fc45fc5-x4jsj\" (UID: \"b85d7429-1a02-4051-b96d-692a3bc3bacc\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x4jsj" Apr 22 17:55:33.778589 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.778570 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5544d4d987-rql9c"] Apr 22 17:55:33.869774 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.869735 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djhzn\" (UniqueName: \"kubernetes.io/projected/b85d7429-1a02-4051-b96d-692a3bc3bacc-kube-api-access-djhzn\") pod \"service-ca-operator-d6fc45fc5-x4jsj\" (UID: \"b85d7429-1a02-4051-b96d-692a3bc3bacc\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x4jsj" Apr 22 17:55:33.869774 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.869779 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85d7429-1a02-4051-b96d-692a3bc3bacc-config\") pod \"service-ca-operator-d6fc45fc5-x4jsj\" (UID: \"b85d7429-1a02-4051-b96d-692a3bc3bacc\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x4jsj" Apr 22 17:55:33.869990 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.869809 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3e43894b-483d-49fb-a577-da6aec0956f0-registry-certificates\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.869990 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.869826 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3e43894b-483d-49fb-a577-da6aec0956f0-installation-pull-secrets\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.869990 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.869858 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-bound-sa-token\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.869990 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.869901 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c91290f-1a67-4f2b-bb75-f6e0647e34d5-serving-cert\") 
pod \"kube-storage-version-migrator-operator-6769c5d45-82zvn\" (UID: \"3c91290f-1a67-4f2b-bb75-f6e0647e34d5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-82zvn" Apr 22 17:55:33.869990 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.869948 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e43894b-483d-49fb-a577-da6aec0956f0-trusted-ca\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.869990 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.869988 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.870205 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.870018 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3e43894b-483d-49fb-a577-da6aec0956f0-ca-trust-extracted\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.870205 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.870044 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c91290f-1a67-4f2b-bb75-f6e0647e34d5-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-82zvn\" (UID: \"3c91290f-1a67-4f2b-bb75-f6e0647e34d5\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-82zvn" Apr 22 17:55:33.870205 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.870060 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqtnz\" (UniqueName: \"kubernetes.io/projected/3c91290f-1a67-4f2b-bb75-f6e0647e34d5-kube-api-access-kqtnz\") pod \"kube-storage-version-migrator-operator-6769c5d45-82zvn\" (UID: \"3c91290f-1a67-4f2b-bb75-f6e0647e34d5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-82zvn" Apr 22 17:55:33.870205 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.870079 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3e43894b-483d-49fb-a577-da6aec0956f0-image-registry-private-configuration\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.870205 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.870101 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b85d7429-1a02-4051-b96d-692a3bc3bacc-serving-cert\") pod \"service-ca-operator-d6fc45fc5-x4jsj\" (UID: \"b85d7429-1a02-4051-b96d-692a3bc3bacc\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x4jsj" Apr 22 17:55:33.870205 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.870135 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npktp\" (UniqueName: \"kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-kube-api-access-npktp\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " 
pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.870205 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.870168 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jssh\" (UniqueName: \"kubernetes.io/projected/7e087626-eed9-4af5-a0e7-65ed53ddb4a0-kube-api-access-6jssh\") pod \"volume-data-source-validator-7c6cbb6c87-6lkst\" (UID: \"7e087626-eed9-4af5-a0e7-65ed53ddb4a0\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6lkst" Apr 22 17:55:33.870995 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.870976 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85d7429-1a02-4051-b96d-692a3bc3bacc-config\") pod \"service-ca-operator-d6fc45fc5-x4jsj\" (UID: \"b85d7429-1a02-4051-b96d-692a3bc3bacc\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x4jsj" Apr 22 17:55:33.872220 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.872202 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b85d7429-1a02-4051-b96d-692a3bc3bacc-serving-cert\") pod \"service-ca-operator-d6fc45fc5-x4jsj\" (UID: \"b85d7429-1a02-4051-b96d-692a3bc3bacc\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x4jsj" Apr 22 17:55:33.881119 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.881098 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djhzn\" (UniqueName: \"kubernetes.io/projected/b85d7429-1a02-4051-b96d-692a3bc3bacc-kube-api-access-djhzn\") pod \"service-ca-operator-d6fc45fc5-x4jsj\" (UID: \"b85d7429-1a02-4051-b96d-692a3bc3bacc\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x4jsj" Apr 22 17:55:33.955529 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.955444 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x4jsj" Apr 22 17:55:33.971395 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.971363 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e43894b-483d-49fb-a577-da6aec0956f0-trusted-ca\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.971498 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.971405 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.971498 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.971448 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3e43894b-483d-49fb-a577-da6aec0956f0-ca-trust-extracted\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.971607 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:33.971592 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:55:33.971650 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:33.971610 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5544d4d987-rql9c: secret "image-registry-tls" not found Apr 22 17:55:33.971692 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:33.971679 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls podName:3e43894b-483d-49fb-a577-da6aec0956f0 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:34.471662472 +0000 UTC m=+149.159122871 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls") pod "image-registry-5544d4d987-rql9c" (UID: "3e43894b-483d-49fb-a577-da6aec0956f0") : secret "image-registry-tls" not found Apr 22 17:55:33.971747 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.971722 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c91290f-1a67-4f2b-bb75-f6e0647e34d5-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-82zvn\" (UID: \"3c91290f-1a67-4f2b-bb75-f6e0647e34d5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-82zvn" Apr 22 17:55:33.971798 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.971766 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqtnz\" (UniqueName: \"kubernetes.io/projected/3c91290f-1a67-4f2b-bb75-f6e0647e34d5-kube-api-access-kqtnz\") pod \"kube-storage-version-migrator-operator-6769c5d45-82zvn\" (UID: \"3c91290f-1a67-4f2b-bb75-f6e0647e34d5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-82zvn" Apr 22 17:55:33.971868 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.971823 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3e43894b-483d-49fb-a577-da6aec0956f0-image-registry-private-configuration\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" 
Apr 22 17:55:33.971932 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.971891 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npktp\" (UniqueName: \"kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-kube-api-access-npktp\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.971932 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.971921 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jssh\" (UniqueName: \"kubernetes.io/projected/7e087626-eed9-4af5-a0e7-65ed53ddb4a0-kube-api-access-6jssh\") pod \"volume-data-source-validator-7c6cbb6c87-6lkst\" (UID: \"7e087626-eed9-4af5-a0e7-65ed53ddb4a0\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6lkst" Apr 22 17:55:33.972049 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.971983 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3e43894b-483d-49fb-a577-da6aec0956f0-registry-certificates\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.972049 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.972010 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3e43894b-483d-49fb-a577-da6aec0956f0-ca-trust-extracted\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.972049 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.972011 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/3e43894b-483d-49fb-a577-da6aec0956f0-installation-pull-secrets\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.972049 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.972056 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-bound-sa-token\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.972283 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.972084 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c91290f-1a67-4f2b-bb75-f6e0647e34d5-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-82zvn\" (UID: \"3c91290f-1a67-4f2b-bb75-f6e0647e34d5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-82zvn" Apr 22 17:55:33.972342 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.972306 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c91290f-1a67-4f2b-bb75-f6e0647e34d5-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-82zvn\" (UID: \"3c91290f-1a67-4f2b-bb75-f6e0647e34d5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-82zvn" Apr 22 17:55:33.972771 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.972750 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e43894b-483d-49fb-a577-da6aec0956f0-trusted-ca\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " 
pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.972892 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.972812 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3e43894b-483d-49fb-a577-da6aec0956f0-registry-certificates\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.974617 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.974596 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c91290f-1a67-4f2b-bb75-f6e0647e34d5-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-82zvn\" (UID: \"3c91290f-1a67-4f2b-bb75-f6e0647e34d5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-82zvn" Apr 22 17:55:33.974709 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.974678 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3e43894b-483d-49fb-a577-da6aec0956f0-image-registry-private-configuration\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.974709 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.974688 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3e43894b-483d-49fb-a577-da6aec0956f0-installation-pull-secrets\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.982112 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.982087 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kqtnz\" (UniqueName: \"kubernetes.io/projected/3c91290f-1a67-4f2b-bb75-f6e0647e34d5-kube-api-access-kqtnz\") pod \"kube-storage-version-migrator-operator-6769c5d45-82zvn\" (UID: \"3c91290f-1a67-4f2b-bb75-f6e0647e34d5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-82zvn" Apr 22 17:55:33.982384 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.982365 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jssh\" (UniqueName: \"kubernetes.io/projected/7e087626-eed9-4af5-a0e7-65ed53ddb4a0-kube-api-access-6jssh\") pod \"volume-data-source-validator-7c6cbb6c87-6lkst\" (UID: \"7e087626-eed9-4af5-a0e7-65ed53ddb4a0\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6lkst" Apr 22 17:55:33.983099 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.983079 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-bound-sa-token\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:33.983163 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:33.983148 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npktp\" (UniqueName: \"kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-kube-api-access-npktp\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:34.064205 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:34.064172 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-82zvn" Apr 22 17:55:34.070001 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:34.069977 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6lkst" Apr 22 17:55:34.072717 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:34.072686 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x4jsj"] Apr 22 17:55:34.076108 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:55:34.076080 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb85d7429_1a02_4051_b96d_692a3bc3bacc.slice/crio-d5a5943aadc52596deb2b437e55bbb15184e7650cdd8d4e9fb1a0d79706e9d08 WatchSource:0}: Error finding container d5a5943aadc52596deb2b437e55bbb15184e7650cdd8d4e9fb1a0d79706e9d08: Status 404 returned error can't find the container with id d5a5943aadc52596deb2b437e55bbb15184e7650cdd8d4e9fb1a0d79706e9d08 Apr 22 17:55:34.195912 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:34.195884 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-82zvn"] Apr 22 17:55:34.198941 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:55:34.198914 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c91290f_1a67_4f2b_bb75_f6e0647e34d5.slice/crio-d83fb4bd0aac221e6c05834722207ff41779a602cf1ddc9a481c2ebc96dd3a29 WatchSource:0}: Error finding container d83fb4bd0aac221e6c05834722207ff41779a602cf1ddc9a481c2ebc96dd3a29: Status 404 returned error can't find the container with id d83fb4bd0aac221e6c05834722207ff41779a602cf1ddc9a481c2ebc96dd3a29 Apr 22 17:55:34.209005 ip-10-0-132-24 
kubenswrapper[2574]: I0422 17:55:34.208960 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6lkst"] Apr 22 17:55:34.212275 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:55:34.212253 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e087626_eed9_4af5_a0e7_65ed53ddb4a0.slice/crio-5481e0bb5b7a7c059c85d2b9a4844322f8860c1b5e4de02879950cb83fb9d649 WatchSource:0}: Error finding container 5481e0bb5b7a7c059c85d2b9a4844322f8860c1b5e4de02879950cb83fb9d649: Status 404 returned error can't find the container with id 5481e0bb5b7a7c059c85d2b9a4844322f8860c1b5e4de02879950cb83fb9d649 Apr 22 17:55:34.217783 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:34.217764 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-5t9k8"] Apr 22 17:55:34.222334 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:34.222317 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5t9k8" Apr 22 17:55:34.224719 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:34.224702 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-t7c2v\"" Apr 22 17:55:34.227753 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:34.227732 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-5t9k8"] Apr 22 17:55:34.265608 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:34.265577 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-82zvn" event={"ID":"3c91290f-1a67-4f2b-bb75-f6e0647e34d5","Type":"ContainerStarted","Data":"d83fb4bd0aac221e6c05834722207ff41779a602cf1ddc9a481c2ebc96dd3a29"} Apr 22 17:55:34.266479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:34.266456 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6lkst" event={"ID":"7e087626-eed9-4af5-a0e7-65ed53ddb4a0","Type":"ContainerStarted","Data":"5481e0bb5b7a7c059c85d2b9a4844322f8860c1b5e4de02879950cb83fb9d649"} Apr 22 17:55:34.267308 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:34.267289 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x4jsj" event={"ID":"b85d7429-1a02-4051-b96d-692a3bc3bacc","Type":"ContainerStarted","Data":"d5a5943aadc52596deb2b437e55bbb15184e7650cdd8d4e9fb1a0d79706e9d08"} Apr 22 17:55:34.375602 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:34.375564 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpbnc\" (UniqueName: \"kubernetes.io/projected/b3c615fe-9b43-49f7-b16b-8bb8c1710870-kube-api-access-qpbnc\") pod 
\"network-check-source-8894fc9bd-5t9k8\" (UID: \"b3c615fe-9b43-49f7-b16b-8bb8c1710870\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5t9k8" Apr 22 17:55:34.476050 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:34.476019 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpbnc\" (UniqueName: \"kubernetes.io/projected/b3c615fe-9b43-49f7-b16b-8bb8c1710870-kube-api-access-qpbnc\") pod \"network-check-source-8894fc9bd-5t9k8\" (UID: \"b3c615fe-9b43-49f7-b16b-8bb8c1710870\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5t9k8" Apr 22 17:55:34.476195 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:34.476080 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:34.476195 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:34.476160 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:55:34.476195 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:34.476172 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5544d4d987-rql9c: secret "image-registry-tls" not found Apr 22 17:55:34.476291 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:34.476218 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls podName:3e43894b-483d-49fb-a577-da6aec0956f0 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:35.47620383 +0000 UTC m=+150.163664215 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls") pod "image-registry-5544d4d987-rql9c" (UID: "3e43894b-483d-49fb-a577-da6aec0956f0") : secret "image-registry-tls" not found Apr 22 17:55:34.484860 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:34.484815 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpbnc\" (UniqueName: \"kubernetes.io/projected/b3c615fe-9b43-49f7-b16b-8bb8c1710870-kube-api-access-qpbnc\") pod \"network-check-source-8894fc9bd-5t9k8\" (UID: \"b3c615fe-9b43-49f7-b16b-8bb8c1710870\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5t9k8" Apr 22 17:55:34.531391 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:34.531342 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5t9k8" Apr 22 17:55:34.648134 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:34.648104 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-5t9k8"] Apr 22 17:55:34.651196 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:55:34.651166 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3c615fe_9b43_49f7_b16b_8bb8c1710870.slice/crio-8b934ef7b829abc697a9d5669cf3321fbd6772f3128dd60062ffa5c4b868ef4a WatchSource:0}: Error finding container 8b934ef7b829abc697a9d5669cf3321fbd6772f3128dd60062ffa5c4b868ef4a: Status 404 returned error can't find the container with id 8b934ef7b829abc697a9d5669cf3321fbd6772f3128dd60062ffa5c4b868ef4a Apr 22 17:55:35.273449 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:35.272800 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5t9k8" 
event={"ID":"b3c615fe-9b43-49f7-b16b-8bb8c1710870","Type":"ContainerStarted","Data":"b53e827a70115b5c3fa5fc300926efd77ddbac2d4bdd8ab9dc8a4421b9b6431f"} Apr 22 17:55:35.273449 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:35.272860 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5t9k8" event={"ID":"b3c615fe-9b43-49f7-b16b-8bb8c1710870","Type":"ContainerStarted","Data":"8b934ef7b829abc697a9d5669cf3321fbd6772f3128dd60062ffa5c4b868ef4a"} Apr 22 17:55:35.291895 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:35.291822 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5t9k8" podStartSLOduration=1.29180339 podStartE2EDuration="1.29180339s" podCreationTimestamp="2026-04-22 17:55:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:55:35.291113769 +0000 UTC m=+149.978574189" watchObservedRunningTime="2026-04-22 17:55:35.29180339 +0000 UTC m=+149.979263796" Apr 22 17:55:35.487328 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:35.486644 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:35.487328 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:35.486823 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:55:35.487328 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:35.486856 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5544d4d987-rql9c: secret 
"image-registry-tls" not found Apr 22 17:55:35.487328 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:35.486915 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls podName:3e43894b-483d-49fb-a577-da6aec0956f0 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:37.486896309 +0000 UTC m=+152.174356699 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls") pod "image-registry-5544d4d987-rql9c" (UID: "3e43894b-483d-49fb-a577-da6aec0956f0") : secret "image-registry-tls" not found Apr 22 17:55:37.280023 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:37.279985 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x4jsj" event={"ID":"b85d7429-1a02-4051-b96d-692a3bc3bacc","Type":"ContainerStarted","Data":"b70caa65f1494ab2ae19543328d4bf783f548006c4b0c4be2e335e842e30ec0e"} Apr 22 17:55:37.281485 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:37.281452 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-82zvn" event={"ID":"3c91290f-1a67-4f2b-bb75-f6e0647e34d5","Type":"ContainerStarted","Data":"2f3ca601215b8da188281a1baf46c5eec6320f9f24b84a8e0131c5d37a9621bc"} Apr 22 17:55:37.282789 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:37.282762 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6lkst" event={"ID":"7e087626-eed9-4af5-a0e7-65ed53ddb4a0","Type":"ContainerStarted","Data":"8e5d7489dd97ed4c2afa4e3d12c493d49d24c71ccc4783b0921281d713fcadb4"} Apr 22 17:55:37.296817 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:37.296768 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x4jsj" podStartSLOduration=1.671422424 podStartE2EDuration="4.296756659s" podCreationTimestamp="2026-04-22 17:55:33 +0000 UTC" firstStartedPulling="2026-04-22 17:55:34.077914268 +0000 UTC m=+148.765374652" lastFinishedPulling="2026-04-22 17:55:36.703248498 +0000 UTC m=+151.390708887" observedRunningTime="2026-04-22 17:55:37.295244279 +0000 UTC m=+151.982704687" watchObservedRunningTime="2026-04-22 17:55:37.296756659 +0000 UTC m=+151.984217066" Apr 22 17:55:37.309584 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:37.309545 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6lkst" podStartSLOduration=1.823194327 podStartE2EDuration="4.309532452s" podCreationTimestamp="2026-04-22 17:55:33 +0000 UTC" firstStartedPulling="2026-04-22 17:55:34.213950857 +0000 UTC m=+148.901411245" lastFinishedPulling="2026-04-22 17:55:36.700288985 +0000 UTC m=+151.387749370" observedRunningTime="2026-04-22 17:55:37.309334228 +0000 UTC m=+151.996794636" watchObservedRunningTime="2026-04-22 17:55:37.309532452 +0000 UTC m=+151.996992862" Apr 22 17:55:37.328040 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:37.327987 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-82zvn" podStartSLOduration=1.818153336 podStartE2EDuration="4.327978906s" podCreationTimestamp="2026-04-22 17:55:33 +0000 UTC" firstStartedPulling="2026-04-22 17:55:34.200670833 +0000 UTC m=+148.888131218" lastFinishedPulling="2026-04-22 17:55:36.710496401 +0000 UTC m=+151.397956788" observedRunningTime="2026-04-22 17:55:37.327502479 +0000 UTC m=+152.014962889" watchObservedRunningTime="2026-04-22 17:55:37.327978906 +0000 UTC m=+152.015439310" Apr 22 17:55:37.501090 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:37.501062 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:37.501237 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:37.501197 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:55:37.501237 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:37.501217 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5544d4d987-rql9c: secret "image-registry-tls" not found Apr 22 17:55:37.501317 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:37.501275 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls podName:3e43894b-483d-49fb-a577-da6aec0956f0 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:41.501256915 +0000 UTC m=+156.188717301 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls") pod "image-registry-5544d4d987-rql9c" (UID: "3e43894b-483d-49fb-a577-da6aec0956f0") : secret "image-registry-tls" not found Apr 22 17:55:39.203885 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:39.203848 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wx69w"] Apr 22 17:55:39.207017 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:39.207001 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wx69w" Apr 22 17:55:39.209558 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:39.209537 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 17:55:39.209683 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:39.209663 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-qhkjq\"" Apr 22 17:55:39.209757 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:39.209662 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 17:55:39.213214 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:39.213193 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wx69w"] Apr 22 17:55:39.311909 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:39.311881 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wx69w\" (UID: \"2f935fe0-30e5-4e30-8ea9-ea209b4859e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wx69w" Apr 22 17:55:39.312022 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:39.311997 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wx69w\" (UID: \"2f935fe0-30e5-4e30-8ea9-ea209b4859e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wx69w" Apr 22 17:55:39.347600 ip-10-0-132-24 kubenswrapper[2574]: I0422 
17:55:39.347580 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hbjmn_95190e1c-03c4-4dcf-b739-7c181cb38f82/dns-node-resolver/0.log" Apr 22 17:55:39.412905 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:39.412883 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wx69w\" (UID: \"2f935fe0-30e5-4e30-8ea9-ea209b4859e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wx69w" Apr 22 17:55:39.413007 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:39.412913 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wx69w\" (UID: \"2f935fe0-30e5-4e30-8ea9-ea209b4859e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wx69w" Apr 22 17:55:39.413054 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:39.413020 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:55:39.413087 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:39.413075 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-networking-console-plugin-cert podName:2f935fe0-30e5-4e30-8ea9-ea209b4859e3 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:39.913062341 +0000 UTC m=+154.600522727 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wx69w" (UID: "2f935fe0-30e5-4e30-8ea9-ea209b4859e3") : secret "networking-console-plugin-cert" not found Apr 22 17:55:39.413541 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:39.413520 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wx69w\" (UID: \"2f935fe0-30e5-4e30-8ea9-ea209b4859e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wx69w" Apr 22 17:55:39.917474 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:39.917444 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wx69w\" (UID: \"2f935fe0-30e5-4e30-8ea9-ea209b4859e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wx69w" Apr 22 17:55:39.917663 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:39.917556 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:55:39.917663 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:39.917631 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-networking-console-plugin-cert podName:2f935fe0-30e5-4e30-8ea9-ea209b4859e3 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:40.917614206 +0000 UTC m=+155.605074608 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wx69w" (UID: "2f935fe0-30e5-4e30-8ea9-ea209b4859e3") : secret "networking-console-plugin-cert" not found Apr 22 17:55:40.147101 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.147070 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nzqmg_a684f094-f2e9-4f18-b33b-e466f94313d8/node-ca/0.log" Apr 22 17:55:40.234216 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:40.234185 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-5z57j" podUID="18c1f8e2-48fe-42d1-bfbc-436a196841e4" Apr 22 17:55:40.250492 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:40.250467 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-85lcr" podUID="0a435446-c735-4a7b-bbb9-eab6af3f7b77" Apr 22 17:55:40.293762 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.293737 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5z57j" Apr 22 17:55:40.448817 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.448786 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-8mzr9"] Apr 22 17:55:40.451929 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.451907 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-8mzr9" Apr 22 17:55:40.454739 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.454721 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 17:55:40.454873 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.454810 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 17:55:40.455945 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.455928 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 17:55:40.456005 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.455964 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-djpzc\"" Apr 22 17:55:40.456005 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.455998 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 17:55:40.457894 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.457874 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-8mzr9"] Apr 22 17:55:40.623620 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.623543 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/68bf3dbc-9ab0-4061-bad0-7de57c80fe66-signing-key\") pod \"service-ca-865cb79987-8mzr9\" (UID: \"68bf3dbc-9ab0-4061-bad0-7de57c80fe66\") " pod="openshift-service-ca/service-ca-865cb79987-8mzr9" Apr 22 17:55:40.623620 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.623595 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69wbf\" (UniqueName: 
\"kubernetes.io/projected/68bf3dbc-9ab0-4061-bad0-7de57c80fe66-kube-api-access-69wbf\") pod \"service-ca-865cb79987-8mzr9\" (UID: \"68bf3dbc-9ab0-4061-bad0-7de57c80fe66\") " pod="openshift-service-ca/service-ca-865cb79987-8mzr9" Apr 22 17:55:40.623790 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.623646 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/68bf3dbc-9ab0-4061-bad0-7de57c80fe66-signing-cabundle\") pod \"service-ca-865cb79987-8mzr9\" (UID: \"68bf3dbc-9ab0-4061-bad0-7de57c80fe66\") " pod="openshift-service-ca/service-ca-865cb79987-8mzr9" Apr 22 17:55:40.724710 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.724676 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/68bf3dbc-9ab0-4061-bad0-7de57c80fe66-signing-key\") pod \"service-ca-865cb79987-8mzr9\" (UID: \"68bf3dbc-9ab0-4061-bad0-7de57c80fe66\") " pod="openshift-service-ca/service-ca-865cb79987-8mzr9" Apr 22 17:55:40.724863 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.724735 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69wbf\" (UniqueName: \"kubernetes.io/projected/68bf3dbc-9ab0-4061-bad0-7de57c80fe66-kube-api-access-69wbf\") pod \"service-ca-865cb79987-8mzr9\" (UID: \"68bf3dbc-9ab0-4061-bad0-7de57c80fe66\") " pod="openshift-service-ca/service-ca-865cb79987-8mzr9" Apr 22 17:55:40.724863 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.724761 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/68bf3dbc-9ab0-4061-bad0-7de57c80fe66-signing-cabundle\") pod \"service-ca-865cb79987-8mzr9\" (UID: \"68bf3dbc-9ab0-4061-bad0-7de57c80fe66\") " pod="openshift-service-ca/service-ca-865cb79987-8mzr9" Apr 22 17:55:40.725509 ip-10-0-132-24 kubenswrapper[2574]: I0422 
17:55:40.725490 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/68bf3dbc-9ab0-4061-bad0-7de57c80fe66-signing-cabundle\") pod \"service-ca-865cb79987-8mzr9\" (UID: \"68bf3dbc-9ab0-4061-bad0-7de57c80fe66\") " pod="openshift-service-ca/service-ca-865cb79987-8mzr9" Apr 22 17:55:40.727207 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.727186 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/68bf3dbc-9ab0-4061-bad0-7de57c80fe66-signing-key\") pod \"service-ca-865cb79987-8mzr9\" (UID: \"68bf3dbc-9ab0-4061-bad0-7de57c80fe66\") " pod="openshift-service-ca/service-ca-865cb79987-8mzr9" Apr 22 17:55:40.738382 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.738358 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69wbf\" (UniqueName: \"kubernetes.io/projected/68bf3dbc-9ab0-4061-bad0-7de57c80fe66-kube-api-access-69wbf\") pod \"service-ca-865cb79987-8mzr9\" (UID: \"68bf3dbc-9ab0-4061-bad0-7de57c80fe66\") " pod="openshift-service-ca/service-ca-865cb79987-8mzr9" Apr 22 17:55:40.761227 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.761199 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-8mzr9" Apr 22 17:55:40.869989 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.869962 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-8mzr9"] Apr 22 17:55:40.873279 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:55:40.873253 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68bf3dbc_9ab0_4061_bad0_7de57c80fe66.slice/crio-1a56d837254f4aee4c42cd35eee0ae4fd0d69e15a79ec11def3a052ecfef3bcd WatchSource:0}: Error finding container 1a56d837254f4aee4c42cd35eee0ae4fd0d69e15a79ec11def3a052ecfef3bcd: Status 404 returned error can't find the container with id 1a56d837254f4aee4c42cd35eee0ae4fd0d69e15a79ec11def3a052ecfef3bcd Apr 22 17:55:40.926428 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:40.926408 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wx69w\" (UID: \"2f935fe0-30e5-4e30-8ea9-ea209b4859e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wx69w" Apr 22 17:55:40.926515 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:40.926504 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:55:40.926554 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:40.926550 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-networking-console-plugin-cert podName:2f935fe0-30e5-4e30-8ea9-ea209b4859e3 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:42.926536121 +0000 UTC m=+157.613996505 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wx69w" (UID: "2f935fe0-30e5-4e30-8ea9-ea209b4859e3") : secret "networking-console-plugin-cert" not found Apr 22 17:55:41.297148 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:41.297116 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-8mzr9" event={"ID":"68bf3dbc-9ab0-4061-bad0-7de57c80fe66","Type":"ContainerStarted","Data":"fed64b937d26146b3bffb1a7615c97150c6b6841342fbf885a23e0079bbc1ecd"} Apr 22 17:55:41.297148 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:41.297150 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-8mzr9" event={"ID":"68bf3dbc-9ab0-4061-bad0-7de57c80fe66","Type":"ContainerStarted","Data":"1a56d837254f4aee4c42cd35eee0ae4fd0d69e15a79ec11def3a052ecfef3bcd"} Apr 22 17:55:41.316340 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:41.316286 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-8mzr9" podStartSLOduration=1.31626652 podStartE2EDuration="1.31626652s" podCreationTimestamp="2026-04-22 17:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:55:41.315627559 +0000 UTC m=+156.003087967" watchObservedRunningTime="2026-04-22 17:55:41.31626652 +0000 UTC m=+156.003726933" Apr 22 17:55:41.531254 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:41.531206 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " 
pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:41.531403 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:41.531348 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:55:41.531403 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:41.531368 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5544d4d987-rql9c: secret "image-registry-tls" not found Apr 22 17:55:41.531523 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:41.531422 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls podName:3e43894b-483d-49fb-a577-da6aec0956f0 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:49.531406313 +0000 UTC m=+164.218866698 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls") pod "image-registry-5544d4d987-rql9c" (UID: "3e43894b-483d-49fb-a577-da6aec0956f0") : secret "image-registry-tls" not found Apr 22 17:55:41.866928 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:41.866888 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-fztfm" podUID="027c3a56-b141-4f0e-beda-4bbc2fdc45c6" Apr 22 17:55:42.942031 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:42.941994 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wx69w\" (UID: \"2f935fe0-30e5-4e30-8ea9-ea209b4859e3\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-wx69w" Apr 22 17:55:42.942474 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:42.942165 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:55:42.942474 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:42.942252 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-networking-console-plugin-cert podName:2f935fe0-30e5-4e30-8ea9-ea209b4859e3 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:46.942231202 +0000 UTC m=+161.629691605 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wx69w" (UID: "2f935fe0-30e5-4e30-8ea9-ea209b4859e3") : secret "networking-console-plugin-cert" not found Apr 22 17:55:45.160689 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:45.160610 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert\") pod \"ingress-canary-85lcr\" (UID: \"0a435446-c735-4a7b-bbb9-eab6af3f7b77\") " pod="openshift-ingress-canary/ingress-canary-85lcr" Apr 22 17:55:45.160689 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:45.160669 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls\") pod \"dns-default-5z57j\" (UID: \"18c1f8e2-48fe-42d1-bfbc-436a196841e4\") " pod="openshift-dns/dns-default-5z57j" Apr 22 17:55:45.161103 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:45.160801 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" 
not found Apr 22 17:55:45.161103 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:45.160887 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert podName:0a435446-c735-4a7b-bbb9-eab6af3f7b77 nodeName:}" failed. No retries permitted until 2026-04-22 17:57:47.160868153 +0000 UTC m=+281.848328548 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert") pod "ingress-canary-85lcr" (UID: "0a435446-c735-4a7b-bbb9-eab6af3f7b77") : secret "canary-serving-cert" not found Apr 22 17:55:45.163019 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:45.162996 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18c1f8e2-48fe-42d1-bfbc-436a196841e4-metrics-tls\") pod \"dns-default-5z57j\" (UID: \"18c1f8e2-48fe-42d1-bfbc-436a196841e4\") " pod="openshift-dns/dns-default-5z57j" Apr 22 17:55:45.397504 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:45.397471 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-r5nxd\"" Apr 22 17:55:45.405231 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:45.405209 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-5z57j" Apr 22 17:55:45.517210 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:45.517179 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5z57j"] Apr 22 17:55:45.520659 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:55:45.520628 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18c1f8e2_48fe_42d1_bfbc_436a196841e4.slice/crio-4da546975c02769e2c3d2233d48798a6507efb82a8cdffd2bf9d4b5794ed57e2 WatchSource:0}: Error finding container 4da546975c02769e2c3d2233d48798a6507efb82a8cdffd2bf9d4b5794ed57e2: Status 404 returned error can't find the container with id 4da546975c02769e2c3d2233d48798a6507efb82a8cdffd2bf9d4b5794ed57e2 Apr 22 17:55:46.314065 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:46.314031 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5z57j" event={"ID":"18c1f8e2-48fe-42d1-bfbc-436a196841e4","Type":"ContainerStarted","Data":"4da546975c02769e2c3d2233d48798a6507efb82a8cdffd2bf9d4b5794ed57e2"} Apr 22 17:55:46.978050 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:46.978006 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wx69w\" (UID: \"2f935fe0-30e5-4e30-8ea9-ea209b4859e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wx69w" Apr 22 17:55:46.978236 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:46.978219 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:55:46.978312 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:55:46.978294 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-networking-console-plugin-cert podName:2f935fe0-30e5-4e30-8ea9-ea209b4859e3 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:54.978272397 +0000 UTC m=+169.665732785 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wx69w" (UID: "2f935fe0-30e5-4e30-8ea9-ea209b4859e3") : secret "networking-console-plugin-cert" not found Apr 22 17:55:47.320454 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:47.320422 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5z57j" event={"ID":"18c1f8e2-48fe-42d1-bfbc-436a196841e4","Type":"ContainerStarted","Data":"9853d44948365b4e0419d6936ca7437b96b764b9b9dfb8d92c806fea560f12b6"} Apr 22 17:55:47.320782 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:47.320461 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5z57j" event={"ID":"18c1f8e2-48fe-42d1-bfbc-436a196841e4","Type":"ContainerStarted","Data":"7065c5e6e9c16c26bb573243b3dd605510e7ee618b16b278069c6bda503ce437"} Apr 22 17:55:47.320782 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:47.320555 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-5z57j" Apr 22 17:55:47.343594 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:47.343553 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5z57j" podStartSLOduration=129.058237666 podStartE2EDuration="2m10.343540393s" podCreationTimestamp="2026-04-22 17:53:37 +0000 UTC" firstStartedPulling="2026-04-22 17:55:45.522441068 +0000 UTC m=+160.209901454" lastFinishedPulling="2026-04-22 17:55:46.807743796 +0000 UTC m=+161.495204181" observedRunningTime="2026-04-22 17:55:47.341886199 +0000 UTC 
m=+162.029346598" watchObservedRunningTime="2026-04-22 17:55:47.343540393 +0000 UTC m=+162.031000837" Apr 22 17:55:49.603077 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:49.603039 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:49.605243 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:49.605220 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls\") pod \"image-registry-5544d4d987-rql9c\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") " pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:49.675176 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:49.675153 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:49.791555 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:49.791519 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5544d4d987-rql9c"] Apr 22 17:55:49.795413 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:55:49.795388 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e43894b_483d_49fb_a577_da6aec0956f0.slice/crio-09e8bbe6efae2b4b0797a798126eb088919e270b3833b22fb6e99c4878f5e8be WatchSource:0}: Error finding container 09e8bbe6efae2b4b0797a798126eb088919e270b3833b22fb6e99c4878f5e8be: Status 404 returned error can't find the container with id 09e8bbe6efae2b4b0797a798126eb088919e270b3833b22fb6e99c4878f5e8be Apr 22 17:55:50.330642 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:50.330603 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5544d4d987-rql9c" event={"ID":"3e43894b-483d-49fb-a577-da6aec0956f0","Type":"ContainerStarted","Data":"a3d8675202613d7f3f48b805bcf1aec1f08ae0008b0f48467756d6048288ec49"} Apr 22 17:55:50.330642 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:50.330641 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5544d4d987-rql9c" event={"ID":"3e43894b-483d-49fb-a577-da6aec0956f0","Type":"ContainerStarted","Data":"09e8bbe6efae2b4b0797a798126eb088919e270b3833b22fb6e99c4878f5e8be"} Apr 22 17:55:50.330905 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:50.330713 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5544d4d987-rql9c" Apr 22 17:55:50.349725 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:50.349681 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5544d4d987-rql9c" 
podStartSLOduration=17.349669464 podStartE2EDuration="17.349669464s" podCreationTimestamp="2026-04-22 17:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:55:50.348025831 +0000 UTC m=+165.035486249" watchObservedRunningTime="2026-04-22 17:55:50.349669464 +0000 UTC m=+165.037129871" Apr 22 17:55:52.851765 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:52.851734 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-85lcr" Apr 22 17:55:55.046074 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:55.046027 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wx69w\" (UID: \"2f935fe0-30e5-4e30-8ea9-ea209b4859e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wx69w" Apr 22 17:55:55.048297 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:55.048276 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f935fe0-30e5-4e30-8ea9-ea209b4859e3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wx69w\" (UID: \"2f935fe0-30e5-4e30-8ea9-ea209b4859e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wx69w" Apr 22 17:55:55.115937 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:55.115899 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wx69w" Apr 22 17:55:55.232234 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:55.232201 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wx69w"] Apr 22 17:55:55.234919 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:55:55.234885 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f935fe0_30e5_4e30_8ea9_ea209b4859e3.slice/crio-2d545374161287ad36465660cce01b46ee1351d4f53f5452f46fe9d20fddc374 WatchSource:0}: Error finding container 2d545374161287ad36465660cce01b46ee1351d4f53f5452f46fe9d20fddc374: Status 404 returned error can't find the container with id 2d545374161287ad36465660cce01b46ee1351d4f53f5452f46fe9d20fddc374 Apr 22 17:55:55.345430 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:55.345363 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wx69w" event={"ID":"2f935fe0-30e5-4e30-8ea9-ea209b4859e3","Type":"ContainerStarted","Data":"2d545374161287ad36465660cce01b46ee1351d4f53f5452f46fe9d20fddc374"} Apr 22 17:55:55.853604 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:55.853572 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:55:56.348556 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:56.348528 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wx69w" event={"ID":"2f935fe0-30e5-4e30-8ea9-ea209b4859e3","Type":"ContainerStarted","Data":"6f2a16a5eebe093efab5ff7c0587763c9bec14cd9edd0a9f79a44b5c8dd194ff"} Apr 22 17:55:56.365166 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:56.365130 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wx69w" podStartSLOduration=16.436018995 podStartE2EDuration="17.365117536s" podCreationTimestamp="2026-04-22 17:55:39 +0000 UTC" firstStartedPulling="2026-04-22 17:55:55.236675213 +0000 UTC m=+169.924135601" lastFinishedPulling="2026-04-22 17:55:56.165773749 +0000 UTC m=+170.853234142" observedRunningTime="2026-04-22 17:55:56.36453661 +0000 UTC m=+171.051997017" watchObservedRunningTime="2026-04-22 17:55:56.365117536 +0000 UTC m=+171.052577942" Apr 22 17:55:57.325355 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:55:57.325329 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5z57j" Apr 22 17:56:03.130697 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.130668 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5544d4d987-rql9c"] Apr 22 17:56:03.154286 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.154255 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6w9nf"] Apr 22 17:56:03.157076 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.157060 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6w9nf" Apr 22 17:56:03.159692 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.159669 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 17:56:03.159798 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.159701 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 17:56:03.159867 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.159824 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4dkvz\"" Apr 22 17:56:03.160109 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.160085 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 17:56:03.160245 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.160229 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 17:56:03.172677 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.172657 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8688656c7d-nlp4j"] Apr 22 17:56:03.174398 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.174385 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-8688656c7d-nlp4j" Apr 22 17:56:03.182342 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.182323 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6w9nf"] Apr 22 17:56:03.196379 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.196353 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0f7c8463-f514-4135-ac39-19258a51ead6-data-volume\") pod \"insights-runtime-extractor-6w9nf\" (UID: \"0f7c8463-f514-4135-ac39-19258a51ead6\") " pod="openshift-insights/insights-runtime-extractor-6w9nf" Apr 22 17:56:03.196483 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.196386 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1d5e5c18-bc99-45de-811b-fd57940d36f8-registry-certificates\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j" Apr 22 17:56:03.196483 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.196445 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1d5e5c18-bc99-45de-811b-fd57940d36f8-ca-trust-extracted\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j" Apr 22 17:56:03.196563 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.196486 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmf44\" (UniqueName: \"kubernetes.io/projected/0f7c8463-f514-4135-ac39-19258a51ead6-kube-api-access-mmf44\") pod \"insights-runtime-extractor-6w9nf\" (UID: 
\"0f7c8463-f514-4135-ac39-19258a51ead6\") " pod="openshift-insights/insights-runtime-extractor-6w9nf" Apr 22 17:56:03.196563 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.196505 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1d5e5c18-bc99-45de-811b-fd57940d36f8-installation-pull-secrets\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j" Apr 22 17:56:03.196676 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.196562 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0f7c8463-f514-4135-ac39-19258a51ead6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6w9nf\" (UID: \"0f7c8463-f514-4135-ac39-19258a51ead6\") " pod="openshift-insights/insights-runtime-extractor-6w9nf" Apr 22 17:56:03.196676 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.196584 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1d5e5c18-bc99-45de-811b-fd57940d36f8-bound-sa-token\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j" Apr 22 17:56:03.196676 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.196604 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1d5e5c18-bc99-45de-811b-fd57940d36f8-registry-tls\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j" Apr 22 17:56:03.196676 ip-10-0-132-24 kubenswrapper[2574]: I0422 
17:56:03.196651 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1d5e5c18-bc99-45de-811b-fd57940d36f8-image-registry-private-configuration\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j" Apr 22 17:56:03.196676 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.196672 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0f7c8463-f514-4135-ac39-19258a51ead6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6w9nf\" (UID: \"0f7c8463-f514-4135-ac39-19258a51ead6\") " pod="openshift-insights/insights-runtime-extractor-6w9nf" Apr 22 17:56:03.196893 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.196705 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0f7c8463-f514-4135-ac39-19258a51ead6-crio-socket\") pod \"insights-runtime-extractor-6w9nf\" (UID: \"0f7c8463-f514-4135-ac39-19258a51ead6\") " pod="openshift-insights/insights-runtime-extractor-6w9nf" Apr 22 17:56:03.196893 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.196725 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmdfw\" (UniqueName: \"kubernetes.io/projected/1d5e5c18-bc99-45de-811b-fd57940d36f8-kube-api-access-kmdfw\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j" Apr 22 17:56:03.196893 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.196783 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/1d5e5c18-bc99-45de-811b-fd57940d36f8-trusted-ca\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j"
Apr 22 17:56:03.198379 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.198361 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8688656c7d-nlp4j"]
Apr 22 17:56:03.298018 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.297983 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0f7c8463-f514-4135-ac39-19258a51ead6-data-volume\") pod \"insights-runtime-extractor-6w9nf\" (UID: \"0f7c8463-f514-4135-ac39-19258a51ead6\") " pod="openshift-insights/insights-runtime-extractor-6w9nf"
Apr 22 17:56:03.298018 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.298017 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1d5e5c18-bc99-45de-811b-fd57940d36f8-registry-certificates\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j"
Apr 22 17:56:03.298254 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.298038 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1d5e5c18-bc99-45de-811b-fd57940d36f8-ca-trust-extracted\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j"
Apr 22 17:56:03.298254 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.298167 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmf44\" (UniqueName: \"kubernetes.io/projected/0f7c8463-f514-4135-ac39-19258a51ead6-kube-api-access-mmf44\") pod \"insights-runtime-extractor-6w9nf\" (UID: \"0f7c8463-f514-4135-ac39-19258a51ead6\") " pod="openshift-insights/insights-runtime-extractor-6w9nf"
Apr 22 17:56:03.298254 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.298214 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1d5e5c18-bc99-45de-811b-fd57940d36f8-installation-pull-secrets\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j"
Apr 22 17:56:03.298254 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.298252 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0f7c8463-f514-4135-ac39-19258a51ead6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6w9nf\" (UID: \"0f7c8463-f514-4135-ac39-19258a51ead6\") " pod="openshift-insights/insights-runtime-extractor-6w9nf"
Apr 22 17:56:03.298464 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.298278 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1d5e5c18-bc99-45de-811b-fd57940d36f8-bound-sa-token\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j"
Apr 22 17:56:03.298464 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.298309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1d5e5c18-bc99-45de-811b-fd57940d36f8-registry-tls\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j"
Apr 22 17:56:03.298464 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.298337 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1d5e5c18-bc99-45de-811b-fd57940d36f8-image-registry-private-configuration\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j"
Apr 22 17:56:03.298464 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.298365 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0f7c8463-f514-4135-ac39-19258a51ead6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6w9nf\" (UID: \"0f7c8463-f514-4135-ac39-19258a51ead6\") " pod="openshift-insights/insights-runtime-extractor-6w9nf"
Apr 22 17:56:03.298464 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.298390 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0f7c8463-f514-4135-ac39-19258a51ead6-crio-socket\") pod \"insights-runtime-extractor-6w9nf\" (UID: \"0f7c8463-f514-4135-ac39-19258a51ead6\") " pod="openshift-insights/insights-runtime-extractor-6w9nf"
Apr 22 17:56:03.298464 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.298407 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmdfw\" (UniqueName: \"kubernetes.io/projected/1d5e5c18-bc99-45de-811b-fd57940d36f8-kube-api-access-kmdfw\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j"
Apr 22 17:56:03.298464 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.298412 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1d5e5c18-bc99-45de-811b-fd57940d36f8-ca-trust-extracted\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j"
Apr 22 17:56:03.298464 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.298458 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1d5e5c18-bc99-45de-811b-fd57940d36f8-trusted-ca\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j"
Apr 22 17:56:03.298828 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.298523 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0f7c8463-f514-4135-ac39-19258a51ead6-crio-socket\") pod \"insights-runtime-extractor-6w9nf\" (UID: \"0f7c8463-f514-4135-ac39-19258a51ead6\") " pod="openshift-insights/insights-runtime-extractor-6w9nf"
Apr 22 17:56:03.299012 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.298987 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0f7c8463-f514-4135-ac39-19258a51ead6-data-volume\") pod \"insights-runtime-extractor-6w9nf\" (UID: \"0f7c8463-f514-4135-ac39-19258a51ead6\") " pod="openshift-insights/insights-runtime-extractor-6w9nf"
Apr 22 17:56:03.299378 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.299352 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0f7c8463-f514-4135-ac39-19258a51ead6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6w9nf\" (UID: \"0f7c8463-f514-4135-ac39-19258a51ead6\") " pod="openshift-insights/insights-runtime-extractor-6w9nf"
Apr 22 17:56:03.299580 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.299562 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1d5e5c18-bc99-45de-811b-fd57940d36f8-registry-certificates\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j"
Apr 22 17:56:03.299886 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.299865 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1d5e5c18-bc99-45de-811b-fd57940d36f8-trusted-ca\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j"
Apr 22 17:56:03.301069 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.301046 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1d5e5c18-bc99-45de-811b-fd57940d36f8-image-registry-private-configuration\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j"
Apr 22 17:56:03.301158 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.301089 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1d5e5c18-bc99-45de-811b-fd57940d36f8-registry-tls\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j"
Apr 22 17:56:03.301158 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.301092 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0f7c8463-f514-4135-ac39-19258a51ead6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6w9nf\" (UID: \"0f7c8463-f514-4135-ac39-19258a51ead6\") " pod="openshift-insights/insights-runtime-extractor-6w9nf"
Apr 22 17:56:03.301252 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.301230 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1d5e5c18-bc99-45de-811b-fd57940d36f8-installation-pull-secrets\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j"
Apr 22 17:56:03.308088 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.308066 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmf44\" (UniqueName: \"kubernetes.io/projected/0f7c8463-f514-4135-ac39-19258a51ead6-kube-api-access-mmf44\") pod \"insights-runtime-extractor-6w9nf\" (UID: \"0f7c8463-f514-4135-ac39-19258a51ead6\") " pod="openshift-insights/insights-runtime-extractor-6w9nf"
Apr 22 17:56:03.319996 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.319974 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmdfw\" (UniqueName: \"kubernetes.io/projected/1d5e5c18-bc99-45de-811b-fd57940d36f8-kube-api-access-kmdfw\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j"
Apr 22 17:56:03.322882 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.322860 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1d5e5c18-bc99-45de-811b-fd57940d36f8-bound-sa-token\") pod \"image-registry-8688656c7d-nlp4j\" (UID: \"1d5e5c18-bc99-45de-811b-fd57940d36f8\") " pod="openshift-image-registry/image-registry-8688656c7d-nlp4j"
Apr 22 17:56:03.465485 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.465405 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6w9nf"
Apr 22 17:56:03.482686 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.482663 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8688656c7d-nlp4j"
Apr 22 17:56:03.605164 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.605137 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6w9nf"]
Apr 22 17:56:03.608427 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:56:03.608399 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f7c8463_f514_4135_ac39_19258a51ead6.slice/crio-d61f7d110ed098681a096fdeecf18c54d24cb8049a5ceedbb617647cb31582de WatchSource:0}: Error finding container d61f7d110ed098681a096fdeecf18c54d24cb8049a5ceedbb617647cb31582de: Status 404 returned error can't find the container with id d61f7d110ed098681a096fdeecf18c54d24cb8049a5ceedbb617647cb31582de
Apr 22 17:56:03.630561 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:03.630536 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8688656c7d-nlp4j"]
Apr 22 17:56:03.637868 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:56:03.637819 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d5e5c18_bc99_45de_811b_fd57940d36f8.slice/crio-715d278310df14d898eef0715b50af807e9f560df62af36da6f66cd6fcb697e0 WatchSource:0}: Error finding container 715d278310df14d898eef0715b50af807e9f560df62af36da6f66cd6fcb697e0: Status 404 returned error can't find the container with id 715d278310df14d898eef0715b50af807e9f560df62af36da6f66cd6fcb697e0
Apr 22 17:56:04.372005 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:04.371961 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8688656c7d-nlp4j" event={"ID":"1d5e5c18-bc99-45de-811b-fd57940d36f8","Type":"ContainerStarted","Data":"be1f48786d0d429512afff262b07774930e55eeb9aa3eb5414456fcf9e2515c2"}
Apr 22 17:56:04.372005 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:04.372003 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8688656c7d-nlp4j" event={"ID":"1d5e5c18-bc99-45de-811b-fd57940d36f8","Type":"ContainerStarted","Data":"715d278310df14d898eef0715b50af807e9f560df62af36da6f66cd6fcb697e0"}
Apr 22 17:56:04.372462 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:04.372127 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-8688656c7d-nlp4j"
Apr 22 17:56:04.373316 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:04.373292 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6w9nf" event={"ID":"0f7c8463-f514-4135-ac39-19258a51ead6","Type":"ContainerStarted","Data":"3ac40f940b0623024dc93a7a951307d433fdc9aabd7ba3860e778a4ac73a18e4"}
Apr 22 17:56:04.373434 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:04.373320 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6w9nf" event={"ID":"0f7c8463-f514-4135-ac39-19258a51ead6","Type":"ContainerStarted","Data":"d61f7d110ed098681a096fdeecf18c54d24cb8049a5ceedbb617647cb31582de"}
Apr 22 17:56:04.394011 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:04.393967 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-8688656c7d-nlp4j" podStartSLOduration=1.393953458 podStartE2EDuration="1.393953458s" podCreationTimestamp="2026-04-22 17:56:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:56:04.393654346 +0000 UTC m=+179.081114753" watchObservedRunningTime="2026-04-22 17:56:04.393953458 +0000 UTC m=+179.081413864"
Apr 22 17:56:05.377729 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:05.377692 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6w9nf" event={"ID":"0f7c8463-f514-4135-ac39-19258a51ead6","Type":"ContainerStarted","Data":"b36c4801b91aa4bc666eb162348c12590273fe943f73a01492b7230b1f35b7b1"}
Apr 22 17:56:06.381904 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:06.381867 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6w9nf" event={"ID":"0f7c8463-f514-4135-ac39-19258a51ead6","Type":"ContainerStarted","Data":"73c5a48bdd99491f3ea6105776e77aef49c0949de3f4b959fd6b4679e97dbd74"}
Apr 22 17:56:06.410996 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:06.410949 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6w9nf" podStartSLOduration=1.344676251 podStartE2EDuration="3.410936538s" podCreationTimestamp="2026-04-22 17:56:03 +0000 UTC" firstStartedPulling="2026-04-22 17:56:03.668739464 +0000 UTC m=+178.356199849" lastFinishedPulling="2026-04-22 17:56:05.734999748 +0000 UTC m=+180.422460136" observedRunningTime="2026-04-22 17:56:06.409993778 +0000 UTC m=+181.097454186" watchObservedRunningTime="2026-04-22 17:56:06.410936538 +0000 UTC m=+181.098396945"
Apr 22 17:56:10.860385 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:10.860351 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-4vggv"]
Apr 22 17:56:10.863329 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:10.863312 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-4vggv"
Apr 22 17:56:10.867238 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:10.867175 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 17:56:10.867238 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:10.867196 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 17:56:10.867238 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:10.867177 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-qkrpz\""
Apr 22 17:56:10.867238 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:10.867230 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 22 17:56:10.867497 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:10.867217 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 22 17:56:10.867497 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:10.867228 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 17:56:10.871284 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:10.871228 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-4vggv"]
Apr 22 17:56:10.959042 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:10.959021 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a7e4f65e-1162-4107-9677-62f7613ed0f5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-4vggv\" (UID: \"a7e4f65e-1162-4107-9677-62f7613ed0f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4vggv"
Apr 22 17:56:10.959170 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:10.959087 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a7e4f65e-1162-4107-9677-62f7613ed0f5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-4vggv\" (UID: \"a7e4f65e-1162-4107-9677-62f7613ed0f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4vggv"
Apr 22 17:56:10.959170 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:10.959157 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p779v\" (UniqueName: \"kubernetes.io/projected/a7e4f65e-1162-4107-9677-62f7613ed0f5-kube-api-access-p779v\") pod \"prometheus-operator-5676c8c784-4vggv\" (UID: \"a7e4f65e-1162-4107-9677-62f7613ed0f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4vggv"
Apr 22 17:56:10.959288 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:10.959203 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7e4f65e-1162-4107-9677-62f7613ed0f5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-4vggv\" (UID: \"a7e4f65e-1162-4107-9677-62f7613ed0f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4vggv"
Apr 22 17:56:11.059469 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:11.059443 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a7e4f65e-1162-4107-9677-62f7613ed0f5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-4vggv\" (UID: \"a7e4f65e-1162-4107-9677-62f7613ed0f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4vggv"
Apr 22 17:56:11.059583 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:11.059491 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a7e4f65e-1162-4107-9677-62f7613ed0f5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-4vggv\" (UID: \"a7e4f65e-1162-4107-9677-62f7613ed0f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4vggv"
Apr 22 17:56:11.059583 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:11.059524 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p779v\" (UniqueName: \"kubernetes.io/projected/a7e4f65e-1162-4107-9677-62f7613ed0f5-kube-api-access-p779v\") pod \"prometheus-operator-5676c8c784-4vggv\" (UID: \"a7e4f65e-1162-4107-9677-62f7613ed0f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4vggv"
Apr 22 17:56:11.059583 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:11.059548 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7e4f65e-1162-4107-9677-62f7613ed0f5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-4vggv\" (UID: \"a7e4f65e-1162-4107-9677-62f7613ed0f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4vggv"
Apr 22 17:56:11.060242 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:11.060222 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a7e4f65e-1162-4107-9677-62f7613ed0f5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-4vggv\" (UID: \"a7e4f65e-1162-4107-9677-62f7613ed0f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4vggv"
Apr 22 17:56:11.061941 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:11.061922 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7e4f65e-1162-4107-9677-62f7613ed0f5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-4vggv\" (UID: \"a7e4f65e-1162-4107-9677-62f7613ed0f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4vggv"
Apr 22 17:56:11.062010 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:11.061988 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a7e4f65e-1162-4107-9677-62f7613ed0f5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-4vggv\" (UID: \"a7e4f65e-1162-4107-9677-62f7613ed0f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4vggv"
Apr 22 17:56:11.067584 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:11.067563 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p779v\" (UniqueName: \"kubernetes.io/projected/a7e4f65e-1162-4107-9677-62f7613ed0f5-kube-api-access-p779v\") pod \"prometheus-operator-5676c8c784-4vggv\" (UID: \"a7e4f65e-1162-4107-9677-62f7613ed0f5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-4vggv"
Apr 22 17:56:11.172817 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:11.172767 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-4vggv"
Apr 22 17:56:11.286686 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:11.286506 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-4vggv"]
Apr 22 17:56:11.289290 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:56:11.289262 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7e4f65e_1162_4107_9677_62f7613ed0f5.slice/crio-8b81223bfb4a8c5c798e2bc6eb8b12f297b3a393af056a53fd5a7f03a06f1fce WatchSource:0}: Error finding container 8b81223bfb4a8c5c798e2bc6eb8b12f297b3a393af056a53fd5a7f03a06f1fce: Status 404 returned error can't find the container with id 8b81223bfb4a8c5c798e2bc6eb8b12f297b3a393af056a53fd5a7f03a06f1fce
Apr 22 17:56:11.395640 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:11.395610 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-4vggv" event={"ID":"a7e4f65e-1162-4107-9677-62f7613ed0f5","Type":"ContainerStarted","Data":"8b81223bfb4a8c5c798e2bc6eb8b12f297b3a393af056a53fd5a7f03a06f1fce"}
Apr 22 17:56:13.135664 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:13.135634 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5544d4d987-rql9c"
Apr 22 17:56:13.404089 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:13.404017 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-4vggv" event={"ID":"a7e4f65e-1162-4107-9677-62f7613ed0f5","Type":"ContainerStarted","Data":"6fcbbdd42d9e5e0893305e0afaec3c3f701e4bffe446b7ac74f670fce9f0d527"}
Apr 22 17:56:13.404089 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:13.404049 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-4vggv" event={"ID":"a7e4f65e-1162-4107-9677-62f7613ed0f5","Type":"ContainerStarted","Data":"a9730134fc2b359ba41268edf6fe861e05233d5e6962195c81d913e3bc11bf2e"}
Apr 22 17:56:13.420864 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:13.420792 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-4vggv" podStartSLOduration=2.235398887 podStartE2EDuration="3.42077818s" podCreationTimestamp="2026-04-22 17:56:10 +0000 UTC" firstStartedPulling="2026-04-22 17:56:11.291001528 +0000 UTC m=+185.978461913" lastFinishedPulling="2026-04-22 17:56:12.476380821 +0000 UTC m=+187.163841206" observedRunningTime="2026-04-22 17:56:13.419985524 +0000 UTC m=+188.107445931" watchObservedRunningTime="2026-04-22 17:56:13.42077818 +0000 UTC m=+188.108238587"
Apr 22 17:56:15.225073 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.225044 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-l9d29"]
Apr 22 17:56:15.228600 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.228579 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-l9d29"
Apr 22 17:56:15.231163 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.231133 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 17:56:15.231350 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.231328 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 17:56:15.231562 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.231333 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lp8xv\""
Apr 22 17:56:15.231786 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.231771 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 17:56:15.240491 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.240472 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-qqshn"]
Apr 22 17:56:15.243723 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.243707 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn"
Apr 22 17:56:15.246215 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.246192 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 22 17:56:15.246306 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.246258 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 22 17:56:15.246306 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.246271 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 22 17:56:15.246406 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.246263 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-vpw5g\""
Apr 22 17:56:15.258197 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.258177 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-qqshn"]
Apr 22 17:56:15.291274 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.291232 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-qqshn\" (UID: \"a4cdefe3-79ae-40ff-95ad-5f7ed643723e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn"
Apr 22 17:56:15.291384 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.291297 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c89d511-a6e3-4823-b018-ec96f670c05c-sys\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29"
Apr 22 17:56:15.291384 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.291329 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7c89d511-a6e3-4823-b018-ec96f670c05c-node-exporter-tls\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29"
Apr 22 17:56:15.291384 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.291357 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-qqshn\" (UID: \"a4cdefe3-79ae-40ff-95ad-5f7ed643723e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn"
Apr 22 17:56:15.291543 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.291383 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c89d511-a6e3-4823-b018-ec96f670c05c-metrics-client-ca\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29"
Apr 22 17:56:15.291543 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.291407 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7c89d511-a6e3-4823-b018-ec96f670c05c-node-exporter-accelerators-collector-config\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29"
Apr 22 17:56:15.291543 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.291465 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-qqshn\" (UID: \"a4cdefe3-79ae-40ff-95ad-5f7ed643723e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn"
Apr 22 17:56:15.291543 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.291493 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7c89d511-a6e3-4823-b018-ec96f670c05c-node-exporter-textfile\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29"
Apr 22 17:56:15.291543 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.291516 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd5wz\" (UniqueName: \"kubernetes.io/projected/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-kube-api-access-vd5wz\") pod \"kube-state-metrics-69db897b98-qqshn\" (UID: \"a4cdefe3-79ae-40ff-95ad-5f7ed643723e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn"
Apr 22 17:56:15.291780 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.291583 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-qqshn\" (UID: \"a4cdefe3-79ae-40ff-95ad-5f7ed643723e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn"
Apr 22 17:56:15.291780 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.291609 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-qqshn\" (UID: \"a4cdefe3-79ae-40ff-95ad-5f7ed643723e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn"
Apr 22 17:56:15.291780 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.291637 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs4ln\" (UniqueName: \"kubernetes.io/projected/7c89d511-a6e3-4823-b018-ec96f670c05c-kube-api-access-hs4ln\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29"
Apr 22 17:56:15.291780 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.291662 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7c89d511-a6e3-4823-b018-ec96f670c05c-root\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29"
Apr 22 17:56:15.291780 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.291688 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7c89d511-a6e3-4823-b018-ec96f670c05c-node-exporter-wtmp\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29"
Apr 22 17:56:15.291780 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.291712 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c89d511-a6e3-4823-b018-ec96f670c05c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29"
Apr 22 17:56:15.392570 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.392532 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-qqshn\" (UID: \"a4cdefe3-79ae-40ff-95ad-5f7ed643723e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn"
Apr 22 17:56:15.392718 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.392580 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-qqshn\" (UID: \"a4cdefe3-79ae-40ff-95ad-5f7ed643723e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn"
Apr 22 17:56:15.392718 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.392611 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs4ln\" (UniqueName: \"kubernetes.io/projected/7c89d511-a6e3-4823-b018-ec96f670c05c-kube-api-access-hs4ln\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29"
Apr 22 17:56:15.392718 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.392639 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7c89d511-a6e3-4823-b018-ec96f670c05c-root\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29"
Apr 22 17:56:15.392718 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.392668 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7c89d511-a6e3-4823-b018-ec96f670c05c-node-exporter-wtmp\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29"
Apr 22 17:56:15.392718 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.392696 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c89d511-a6e3-4823-b018-ec96f670c05c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29"
Apr 22 17:56:15.392977 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.392732 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-qqshn\" (UID: \"a4cdefe3-79ae-40ff-95ad-5f7ed643723e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn"
Apr 22 17:56:15.392977 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.392740 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7c89d511-a6e3-4823-b018-ec96f670c05c-root\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29"
Apr 22 17:56:15.392977 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.392787 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c89d511-a6e3-4823-b018-ec96f670c05c-sys\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29"
Apr 22 17:56:15.392977 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.392813 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName:
\"kubernetes.io/host-path/7c89d511-a6e3-4823-b018-ec96f670c05c-node-exporter-wtmp\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29" Apr 22 17:56:15.392977 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.392823 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7c89d511-a6e3-4823-b018-ec96f670c05c-node-exporter-tls\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29" Apr 22 17:56:15.392977 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.392867 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-qqshn\" (UID: \"a4cdefe3-79ae-40ff-95ad-5f7ed643723e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn" Apr 22 17:56:15.392977 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.392870 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c89d511-a6e3-4823-b018-ec96f670c05c-sys\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29" Apr 22 17:56:15.392977 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.392894 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c89d511-a6e3-4823-b018-ec96f670c05c-metrics-client-ca\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29" Apr 22 17:56:15.392977 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.392920 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7c89d511-a6e3-4823-b018-ec96f670c05c-node-exporter-accelerators-collector-config\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29" Apr 22 17:56:15.392977 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.392955 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-qqshn\" (UID: \"a4cdefe3-79ae-40ff-95ad-5f7ed643723e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn" Apr 22 17:56:15.392977 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:56:15.392964 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 17:56:15.392977 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.392967 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-qqshn\" (UID: \"a4cdefe3-79ae-40ff-95ad-5f7ed643723e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn" Apr 22 17:56:15.392977 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.392984 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7c89d511-a6e3-4823-b018-ec96f670c05c-node-exporter-textfile\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29" Apr 22 17:56:15.393570 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:56:15.393023 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7c89d511-a6e3-4823-b018-ec96f670c05c-node-exporter-tls podName:7c89d511-a6e3-4823-b018-ec96f670c05c nodeName:}" failed. No retries permitted until 2026-04-22 17:56:15.893001752 +0000 UTC m=+190.580462144 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/7c89d511-a6e3-4823-b018-ec96f670c05c-node-exporter-tls") pod "node-exporter-l9d29" (UID: "7c89d511-a6e3-4823-b018-ec96f670c05c") : secret "node-exporter-tls" not found Apr 22 17:56:15.393570 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.393057 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vd5wz\" (UniqueName: \"kubernetes.io/projected/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-kube-api-access-vd5wz\") pod \"kube-state-metrics-69db897b98-qqshn\" (UID: \"a4cdefe3-79ae-40ff-95ad-5f7ed643723e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn" Apr 22 17:56:15.393570 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.393260 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7c89d511-a6e3-4823-b018-ec96f670c05c-node-exporter-textfile\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29" Apr 22 17:56:15.393570 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.393409 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-qqshn\" (UID: \"a4cdefe3-79ae-40ff-95ad-5f7ed643723e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn" Apr 22 17:56:15.393770 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:56:15.393672 2574 secret.go:189] Couldn't get 
secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 22 17:56:15.393770 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:56:15.393719 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-kube-state-metrics-tls podName:a4cdefe3-79ae-40ff-95ad-5f7ed643723e nodeName:}" failed. No retries permitted until 2026-04-22 17:56:15.893703208 +0000 UTC m=+190.581163607 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-qqshn" (UID: "a4cdefe3-79ae-40ff-95ad-5f7ed643723e") : secret "kube-state-metrics-tls" not found Apr 22 17:56:15.393887 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.393822 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7c89d511-a6e3-4823-b018-ec96f670c05c-node-exporter-accelerators-collector-config\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29" Apr 22 17:56:15.393887 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.393860 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c89d511-a6e3-4823-b018-ec96f670c05c-metrics-client-ca\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29" Apr 22 17:56:15.394327 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.394308 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-qqshn\" (UID: 
\"a4cdefe3-79ae-40ff-95ad-5f7ed643723e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn" Apr 22 17:56:15.395631 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.395611 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c89d511-a6e3-4823-b018-ec96f670c05c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29" Apr 22 17:56:15.396417 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.396399 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-qqshn\" (UID: \"a4cdefe3-79ae-40ff-95ad-5f7ed643723e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn" Apr 22 17:56:15.402034 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.402015 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd5wz\" (UniqueName: \"kubernetes.io/projected/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-kube-api-access-vd5wz\") pod \"kube-state-metrics-69db897b98-qqshn\" (UID: \"a4cdefe3-79ae-40ff-95ad-5f7ed643723e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn" Apr 22 17:56:15.402746 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.402730 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs4ln\" (UniqueName: \"kubernetes.io/projected/7c89d511-a6e3-4823-b018-ec96f670c05c-kube-api-access-hs4ln\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29" Apr 22 17:56:15.898352 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.898316 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-qqshn\" (UID: \"a4cdefe3-79ae-40ff-95ad-5f7ed643723e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn" Apr 22 17:56:15.898518 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.898367 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7c89d511-a6e3-4823-b018-ec96f670c05c-node-exporter-tls\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29" Apr 22 17:56:15.900637 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.900612 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7c89d511-a6e3-4823-b018-ec96f670c05c-node-exporter-tls\") pod \"node-exporter-l9d29\" (UID: \"7c89d511-a6e3-4823-b018-ec96f670c05c\") " pod="openshift-monitoring/node-exporter-l9d29" Apr 22 17:56:15.900735 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:15.900667 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a4cdefe3-79ae-40ff-95ad-5f7ed643723e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-qqshn\" (UID: \"a4cdefe3-79ae-40ff-95ad-5f7ed643723e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn" Apr 22 17:56:16.137565 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:16.137541 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-l9d29" Apr 22 17:56:16.145295 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:56:16.145262 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c89d511_a6e3_4823_b018_ec96f670c05c.slice/crio-8f0e407bbbb9dc0d730c7b63a384007d575ff44f1df822935e0fac004c37afde WatchSource:0}: Error finding container 8f0e407bbbb9dc0d730c7b63a384007d575ff44f1df822935e0fac004c37afde: Status 404 returned error can't find the container with id 8f0e407bbbb9dc0d730c7b63a384007d575ff44f1df822935e0fac004c37afde Apr 22 17:56:16.152467 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:16.152451 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn" Apr 22 17:56:16.300512 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:16.300443 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-qqshn"] Apr 22 17:56:16.304084 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:56:16.304053 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4cdefe3_79ae_40ff_95ad_5f7ed643723e.slice/crio-4aaf7a305a2564149ecc45fabc31f359d295fb304b22a74216e4b999d77d103c WatchSource:0}: Error finding container 4aaf7a305a2564149ecc45fabc31f359d295fb304b22a74216e4b999d77d103c: Status 404 returned error can't find the container with id 4aaf7a305a2564149ecc45fabc31f359d295fb304b22a74216e4b999d77d103c Apr 22 17:56:16.426743 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:16.426660 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l9d29" event={"ID":"7c89d511-a6e3-4823-b018-ec96f670c05c","Type":"ContainerStarted","Data":"8f0e407bbbb9dc0d730c7b63a384007d575ff44f1df822935e0fac004c37afde"} Apr 22 17:56:16.427773 ip-10-0-132-24 kubenswrapper[2574]: 
I0422 17:56:16.427745 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn" event={"ID":"a4cdefe3-79ae-40ff-95ad-5f7ed643723e","Type":"ContainerStarted","Data":"4aaf7a305a2564149ecc45fabc31f359d295fb304b22a74216e4b999d77d103c"} Apr 22 17:56:17.431876 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:17.431824 2574 generic.go:358] "Generic (PLEG): container finished" podID="7c89d511-a6e3-4823-b018-ec96f670c05c" containerID="5cc3068e6d5e31a1f33faa01d0703b158c14da1dd1838dfa28c460042ecd8d3c" exitCode=0 Apr 22 17:56:17.432306 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:17.431930 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l9d29" event={"ID":"7c89d511-a6e3-4823-b018-ec96f670c05c","Type":"ContainerDied","Data":"5cc3068e6d5e31a1f33faa01d0703b158c14da1dd1838dfa28c460042ecd8d3c"} Apr 22 17:56:18.436410 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:18.436380 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn" event={"ID":"a4cdefe3-79ae-40ff-95ad-5f7ed643723e","Type":"ContainerStarted","Data":"cdccac9cf1b173b7a83878be4da2694e07e97aa2650fcd2ce1e19950d802b7b9"} Apr 22 17:56:18.436909 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:18.436416 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn" event={"ID":"a4cdefe3-79ae-40ff-95ad-5f7ed643723e","Type":"ContainerStarted","Data":"0a0e399dcb997bddaf30ac500fc72a1921eae16986516df850495e17aca71b33"} Apr 22 17:56:18.436909 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:18.436434 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn" event={"ID":"a4cdefe3-79ae-40ff-95ad-5f7ed643723e","Type":"ContainerStarted","Data":"6bdd9da79962f934541d21e07d3fb3821f4dd08df295cbe7aa24e065d36c26b8"} Apr 22 17:56:18.438285 ip-10-0-132-24 
kubenswrapper[2574]: I0422 17:56:18.438262 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l9d29" event={"ID":"7c89d511-a6e3-4823-b018-ec96f670c05c","Type":"ContainerStarted","Data":"181cc7ad23a1a13fd23ad60844611659ba0a8fe093fdf32323147e2b7023130a"} Apr 22 17:56:18.438406 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:18.438289 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l9d29" event={"ID":"7c89d511-a6e3-4823-b018-ec96f670c05c","Type":"ContainerStarted","Data":"4f3eadcf6400288d8ffe8ef1242d5619cebd5ccafa80e7455677c9b2c43c0cc4"} Apr 22 17:56:18.460001 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:18.459962 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-qqshn" podStartSLOduration=2.086001504 podStartE2EDuration="3.45995052s" podCreationTimestamp="2026-04-22 17:56:15 +0000 UTC" firstStartedPulling="2026-04-22 17:56:16.305874946 +0000 UTC m=+190.993335332" lastFinishedPulling="2026-04-22 17:56:17.67982396 +0000 UTC m=+192.367284348" observedRunningTime="2026-04-22 17:56:18.458859708 +0000 UTC m=+193.146320112" watchObservedRunningTime="2026-04-22 17:56:18.45995052 +0000 UTC m=+193.147410927" Apr 22 17:56:18.479618 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:18.479573 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-l9d29" podStartSLOduration=2.752940847 podStartE2EDuration="3.479557674s" podCreationTimestamp="2026-04-22 17:56:15 +0000 UTC" firstStartedPulling="2026-04-22 17:56:16.146978733 +0000 UTC m=+190.834439118" lastFinishedPulling="2026-04-22 17:56:16.873595549 +0000 UTC m=+191.561055945" observedRunningTime="2026-04-22 17:56:18.478455299 +0000 UTC m=+193.165915707" watchObservedRunningTime="2026-04-22 17:56:18.479557674 +0000 UTC m=+193.167018082" Apr 22 17:56:19.723110 ip-10-0-132-24 kubenswrapper[2574]: I0422 
17:56:19.723078 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7d7d964cb8-z9699"] Apr 22 17:56:19.729425 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.729392 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699" Apr 22 17:56:19.733411 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.733385 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 17:56:19.733540 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.733457 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-ptm5f\"" Apr 22 17:56:19.733540 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.733506 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-6lc99o09ski32\"" Apr 22 17:56:19.733667 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.733386 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 17:56:19.733667 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.733395 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 17:56:19.733667 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.733547 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 17:56:19.736649 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.736628 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7d7d964cb8-z9699"] Apr 22 17:56:19.832560 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.832526 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/114c0cb9-c1db-4db0-b0fc-5585476a4502-audit-log\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: \"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699" Apr 22 17:56:19.832560 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.832562 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/114c0cb9-c1db-4db0-b0fc-5585476a4502-metrics-server-audit-profiles\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: \"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699" Apr 22 17:56:19.832767 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.832581 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/114c0cb9-c1db-4db0-b0fc-5585476a4502-client-ca-bundle\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: \"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699" Apr 22 17:56:19.832767 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.832680 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/114c0cb9-c1db-4db0-b0fc-5585476a4502-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: \"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699" Apr 22 17:56:19.832767 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.832722 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/114c0cb9-c1db-4db0-b0fc-5585476a4502-secret-metrics-server-tls\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: \"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699" Apr 22 17:56:19.832909 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.832770 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4ljt\" (UniqueName: \"kubernetes.io/projected/114c0cb9-c1db-4db0-b0fc-5585476a4502-kube-api-access-m4ljt\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: \"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699" Apr 22 17:56:19.832909 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.832817 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/114c0cb9-c1db-4db0-b0fc-5585476a4502-secret-metrics-server-client-certs\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: \"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699" Apr 22 17:56:19.933764 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.933726 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/114c0cb9-c1db-4db0-b0fc-5585476a4502-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: \"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699" Apr 22 17:56:19.933973 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.933862 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/114c0cb9-c1db-4db0-b0fc-5585476a4502-secret-metrics-server-tls\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: 
\"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699" Apr 22 17:56:19.933973 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.933921 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4ljt\" (UniqueName: \"kubernetes.io/projected/114c0cb9-c1db-4db0-b0fc-5585476a4502-kube-api-access-m4ljt\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: \"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699" Apr 22 17:56:19.933973 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.933963 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/114c0cb9-c1db-4db0-b0fc-5585476a4502-secret-metrics-server-client-certs\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: \"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699" Apr 22 17:56:19.934136 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.933995 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/114c0cb9-c1db-4db0-b0fc-5585476a4502-audit-log\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: \"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699" Apr 22 17:56:19.934136 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.934020 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/114c0cb9-c1db-4db0-b0fc-5585476a4502-metrics-server-audit-profiles\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: \"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699" Apr 22 17:56:19.934136 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.934072 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/114c0cb9-c1db-4db0-b0fc-5585476a4502-client-ca-bundle\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: \"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699"
Apr 22 17:56:19.934560 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.934535 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/114c0cb9-c1db-4db0-b0fc-5585476a4502-audit-log\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: \"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699"
Apr 22 17:56:19.934655 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.934564 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/114c0cb9-c1db-4db0-b0fc-5585476a4502-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: \"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699"
Apr 22 17:56:19.934999 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.934973 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/114c0cb9-c1db-4db0-b0fc-5585476a4502-metrics-server-audit-profiles\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: \"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699"
Apr 22 17:56:19.936957 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.936934 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/114c0cb9-c1db-4db0-b0fc-5585476a4502-client-ca-bundle\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: \"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699"
Apr 22 17:56:19.938111 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.937461 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/114c0cb9-c1db-4db0-b0fc-5585476a4502-secret-metrics-server-client-certs\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: \"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699"
Apr 22 17:56:19.938482 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.938462 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/114c0cb9-c1db-4db0-b0fc-5585476a4502-secret-metrics-server-tls\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: \"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699"
Apr 22 17:56:19.947192 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:19.947163 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4ljt\" (UniqueName: \"kubernetes.io/projected/114c0cb9-c1db-4db0-b0fc-5585476a4502-kube-api-access-m4ljt\") pod \"metrics-server-7d7d964cb8-z9699\" (UID: \"114c0cb9-c1db-4db0-b0fc-5585476a4502\") " pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699"
Apr 22 17:56:20.039689 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:20.039590 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699"
Apr 22 17:56:20.168642 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:20.168586 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7d7d964cb8-z9699"]
Apr 22 17:56:20.172657 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:56:20.172621 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod114c0cb9_c1db_4db0_b0fc_5585476a4502.slice/crio-7bacf72c52a6d586b72c5f47d211923ecfbf5c913581480edeb190ca3918fe42 WatchSource:0}: Error finding container 7bacf72c52a6d586b72c5f47d211923ecfbf5c913581480edeb190ca3918fe42: Status 404 returned error can't find the container with id 7bacf72c52a6d586b72c5f47d211923ecfbf5c913581480edeb190ca3918fe42
Apr 22 17:56:20.444679 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:20.444600 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699" event={"ID":"114c0cb9-c1db-4db0-b0fc-5585476a4502","Type":"ContainerStarted","Data":"7bacf72c52a6d586b72c5f47d211923ecfbf5c913581480edeb190ca3918fe42"}
Apr 22 17:56:22.452728 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:22.452696 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699" event={"ID":"114c0cb9-c1db-4db0-b0fc-5585476a4502","Type":"ContainerStarted","Data":"8463c2e29c2b70e1249eee41f7a01a36f43b191a48c6c43ba61afaeef8dc0708"}
Apr 22 17:56:22.474376 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:22.474310 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699" podStartSLOduration=1.888809865 podStartE2EDuration="3.474292442s" podCreationTimestamp="2026-04-22 17:56:19 +0000 UTC" firstStartedPulling="2026-04-22 17:56:20.174566507 +0000 UTC m=+194.862026892" lastFinishedPulling="2026-04-22 17:56:21.76004908 +0000 UTC m=+196.447509469" observedRunningTime="2026-04-22 17:56:22.472967551 +0000 UTC m=+197.160427957" watchObservedRunningTime="2026-04-22 17:56:22.474292442 +0000 UTC m=+197.161752849"
Apr 22 17:56:25.382406 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:25.382373 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-8688656c7d-nlp4j"
Apr 22 17:56:28.149148 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.149109 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5544d4d987-rql9c" podUID="3e43894b-483d-49fb-a577-da6aec0956f0" containerName="registry" containerID="cri-o://a3d8675202613d7f3f48b805bcf1aec1f08ae0008b0f48467756d6048288ec49" gracePeriod=30
Apr 22 17:56:28.386682 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.386658 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5544d4d987-rql9c"
Apr 22 17:56:28.471781 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.471752 2574 generic.go:358] "Generic (PLEG): container finished" podID="3e43894b-483d-49fb-a577-da6aec0956f0" containerID="a3d8675202613d7f3f48b805bcf1aec1f08ae0008b0f48467756d6048288ec49" exitCode=0
Apr 22 17:56:28.471944 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.471792 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5544d4d987-rql9c" event={"ID":"3e43894b-483d-49fb-a577-da6aec0956f0","Type":"ContainerDied","Data":"a3d8675202613d7f3f48b805bcf1aec1f08ae0008b0f48467756d6048288ec49"}
Apr 22 17:56:28.471944 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.471813 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5544d4d987-rql9c" event={"ID":"3e43894b-483d-49fb-a577-da6aec0956f0","Type":"ContainerDied","Data":"09e8bbe6efae2b4b0797a798126eb088919e270b3833b22fb6e99c4878f5e8be"}
Apr 22 17:56:28.471944 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.471828 2574 scope.go:117] "RemoveContainer" containerID="a3d8675202613d7f3f48b805bcf1aec1f08ae0008b0f48467756d6048288ec49"
Apr 22 17:56:28.471944 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.471854 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5544d4d987-rql9c"
Apr 22 17:56:28.479320 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.479291 2574 scope.go:117] "RemoveContainer" containerID="a3d8675202613d7f3f48b805bcf1aec1f08ae0008b0f48467756d6048288ec49"
Apr 22 17:56:28.479588 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:56:28.479564 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3d8675202613d7f3f48b805bcf1aec1f08ae0008b0f48467756d6048288ec49\": container with ID starting with a3d8675202613d7f3f48b805bcf1aec1f08ae0008b0f48467756d6048288ec49 not found: ID does not exist" containerID="a3d8675202613d7f3f48b805bcf1aec1f08ae0008b0f48467756d6048288ec49"
Apr 22 17:56:28.479657 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.479600 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3d8675202613d7f3f48b805bcf1aec1f08ae0008b0f48467756d6048288ec49"} err="failed to get container status \"a3d8675202613d7f3f48b805bcf1aec1f08ae0008b0f48467756d6048288ec49\": rpc error: code = NotFound desc = could not find container \"a3d8675202613d7f3f48b805bcf1aec1f08ae0008b0f48467756d6048288ec49\": container with ID starting with a3d8675202613d7f3f48b805bcf1aec1f08ae0008b0f48467756d6048288ec49 not found: ID does not exist"
Apr 22 17:56:28.510941 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.510908 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3e43894b-483d-49fb-a577-da6aec0956f0-registry-certificates\") pod \"3e43894b-483d-49fb-a577-da6aec0956f0\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") "
Apr 22 17:56:28.511092 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.510949 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npktp\" (UniqueName: \"kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-kube-api-access-npktp\") pod \"3e43894b-483d-49fb-a577-da6aec0956f0\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") "
Apr 22 17:56:28.511092 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.510995 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3e43894b-483d-49fb-a577-da6aec0956f0-installation-pull-secrets\") pod \"3e43894b-483d-49fb-a577-da6aec0956f0\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") "
Apr 22 17:56:28.511092 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.511027 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-bound-sa-token\") pod \"3e43894b-483d-49fb-a577-da6aec0956f0\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") "
Apr 22 17:56:28.511092 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.511058 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e43894b-483d-49fb-a577-da6aec0956f0-trusted-ca\") pod \"3e43894b-483d-49fb-a577-da6aec0956f0\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") "
Apr 22 17:56:28.511092 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.511081 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls\") pod \"3e43894b-483d-49fb-a577-da6aec0956f0\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") "
Apr 22 17:56:28.511343 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.511167 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3e43894b-483d-49fb-a577-da6aec0956f0-ca-trust-extracted\") pod \"3e43894b-483d-49fb-a577-da6aec0956f0\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") "
Apr 22 17:56:28.511343 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.511243 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3e43894b-483d-49fb-a577-da6aec0956f0-image-registry-private-configuration\") pod \"3e43894b-483d-49fb-a577-da6aec0956f0\" (UID: \"3e43894b-483d-49fb-a577-da6aec0956f0\") "
Apr 22 17:56:28.511479 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.511453 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e43894b-483d-49fb-a577-da6aec0956f0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3e43894b-483d-49fb-a577-da6aec0956f0" (UID: "3e43894b-483d-49fb-a577-da6aec0956f0"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:56:28.511572 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.511546 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e43894b-483d-49fb-a577-da6aec0956f0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3e43894b-483d-49fb-a577-da6aec0956f0" (UID: "3e43894b-483d-49fb-a577-da6aec0956f0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:56:28.511627 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.511556 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3e43894b-483d-49fb-a577-da6aec0956f0-registry-certificates\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\""
Apr 22 17:56:28.513433 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.513406 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e43894b-483d-49fb-a577-da6aec0956f0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3e43894b-483d-49fb-a577-da6aec0956f0" (UID: "3e43894b-483d-49fb-a577-da6aec0956f0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:56:28.513545 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.513429 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-kube-api-access-npktp" (OuterVolumeSpecName: "kube-api-access-npktp") pod "3e43894b-483d-49fb-a577-da6aec0956f0" (UID: "3e43894b-483d-49fb-a577-da6aec0956f0"). InnerVolumeSpecName "kube-api-access-npktp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:56:28.513673 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.513649 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "3e43894b-483d-49fb-a577-da6aec0956f0" (UID: "3e43894b-483d-49fb-a577-da6aec0956f0"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:56:28.513673 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.513661 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3e43894b-483d-49fb-a577-da6aec0956f0" (UID: "3e43894b-483d-49fb-a577-da6aec0956f0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:56:28.513796 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.513782 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e43894b-483d-49fb-a577-da6aec0956f0-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "3e43894b-483d-49fb-a577-da6aec0956f0" (UID: "3e43894b-483d-49fb-a577-da6aec0956f0"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:56:28.521774 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.521745 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e43894b-483d-49fb-a577-da6aec0956f0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3e43894b-483d-49fb-a577-da6aec0956f0" (UID: "3e43894b-483d-49fb-a577-da6aec0956f0"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:56:28.612705 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.612653 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3e43894b-483d-49fb-a577-da6aec0956f0-installation-pull-secrets\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\""
Apr 22 17:56:28.612705 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.612697 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-bound-sa-token\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\""
Apr 22 17:56:28.612705 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.612707 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e43894b-483d-49fb-a577-da6aec0956f0-trusted-ca\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\""
Apr 22 17:56:28.612705 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.612718 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-registry-tls\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\""
Apr 22 17:56:28.613003 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.612727 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3e43894b-483d-49fb-a577-da6aec0956f0-ca-trust-extracted\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\""
Apr 22 17:56:28.613003 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.612737 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3e43894b-483d-49fb-a577-da6aec0956f0-image-registry-private-configuration\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\""
Apr 22 17:56:28.613003 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.612746 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-npktp\" (UniqueName: \"kubernetes.io/projected/3e43894b-483d-49fb-a577-da6aec0956f0-kube-api-access-npktp\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\""
Apr 22 17:56:28.792850 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.792812 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5544d4d987-rql9c"]
Apr 22 17:56:28.797061 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:28.797009 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5544d4d987-rql9c"]
Apr 22 17:56:29.855267 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:29.855234 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e43894b-483d-49fb-a577-da6aec0956f0" path="/var/lib/kubelet/pods/3e43894b-483d-49fb-a577-da6aec0956f0/volumes"
Apr 22 17:56:33.353517 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:33.353486 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-vsfb6"]
Apr 22 17:56:33.353907 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:33.353750 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e43894b-483d-49fb-a577-da6aec0956f0" containerName="registry"
Apr 22 17:56:33.353907 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:33.353760 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e43894b-483d-49fb-a577-da6aec0956f0" containerName="registry"
Apr 22 17:56:33.353907 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:33.353812 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e43894b-483d-49fb-a577-da6aec0956f0" containerName="registry"
Apr 22 17:56:33.357229 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:33.357212 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-vsfb6"
Apr 22 17:56:33.362399 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:33.362282 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 22 17:56:33.362399 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:33.362303 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 22 17:56:33.362399 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:33.362318 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-n5ttj\""
Apr 22 17:56:33.386693 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:33.386667 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-vsfb6"]
Apr 22 17:56:33.451014 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:33.450979 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqcwp\" (UniqueName: \"kubernetes.io/projected/2c1e09a8-f37d-488d-875f-501a78bdc7b1-kube-api-access-tqcwp\") pod \"downloads-6bcc868b7-vsfb6\" (UID: \"2c1e09a8-f37d-488d-875f-501a78bdc7b1\") " pod="openshift-console/downloads-6bcc868b7-vsfb6"
Apr 22 17:56:33.551892 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:33.551857 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqcwp\" (UniqueName: \"kubernetes.io/projected/2c1e09a8-f37d-488d-875f-501a78bdc7b1-kube-api-access-tqcwp\") pod \"downloads-6bcc868b7-vsfb6\" (UID: \"2c1e09a8-f37d-488d-875f-501a78bdc7b1\") " pod="openshift-console/downloads-6bcc868b7-vsfb6"
Apr 22 17:56:33.560097 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:33.560066 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqcwp\" (UniqueName: \"kubernetes.io/projected/2c1e09a8-f37d-488d-875f-501a78bdc7b1-kube-api-access-tqcwp\") pod \"downloads-6bcc868b7-vsfb6\" (UID: \"2c1e09a8-f37d-488d-875f-501a78bdc7b1\") " pod="openshift-console/downloads-6bcc868b7-vsfb6"
Apr 22 17:56:33.666072 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:33.665979 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-vsfb6"
Apr 22 17:56:33.788258 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:33.788229 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-vsfb6"]
Apr 22 17:56:33.791813 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:56:33.791783 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c1e09a8_f37d_488d_875f_501a78bdc7b1.slice/crio-6032240d54a0092e49643a9a44290098934a6fb117c172e485ff56ed3becfc03 WatchSource:0}: Error finding container 6032240d54a0092e49643a9a44290098934a6fb117c172e485ff56ed3becfc03: Status 404 returned error can't find the container with id 6032240d54a0092e49643a9a44290098934a6fb117c172e485ff56ed3becfc03
Apr 22 17:56:34.490458 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:34.490406 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-vsfb6" event={"ID":"2c1e09a8-f37d-488d-875f-501a78bdc7b1","Type":"ContainerStarted","Data":"6032240d54a0092e49643a9a44290098934a6fb117c172e485ff56ed3becfc03"}
Apr 22 17:56:38.926378 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:38.926340 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5766fb98dc-7hgzs"]
Apr 22 17:56:38.938232 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:38.938194 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5766fb98dc-7hgzs"]
Apr 22 17:56:38.938401 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:38.938321 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:56:38.942887 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:38.942476 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 22 17:56:38.942887 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:38.942741 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 22 17:56:38.942887 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:38.942741 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 22 17:56:38.942887 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:38.942756 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-dvf9d\""
Apr 22 17:56:38.942887 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:38.942825 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 22 17:56:38.942887 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:38.942826 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 22 17:56:39.103009 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.102975 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b649836c-a791-440c-bca3-15943a388a52-console-oauth-config\") pod \"console-5766fb98dc-7hgzs\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:56:39.103202 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.103046 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b649836c-a791-440c-bca3-15943a388a52-console-serving-cert\") pod \"console-5766fb98dc-7hgzs\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:56:39.103202 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.103106 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b649836c-a791-440c-bca3-15943a388a52-service-ca\") pod \"console-5766fb98dc-7hgzs\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:56:39.103202 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.103180 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b649836c-a791-440c-bca3-15943a388a52-console-config\") pod \"console-5766fb98dc-7hgzs\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:56:39.103382 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.103218 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlrb7\" (UniqueName: \"kubernetes.io/projected/b649836c-a791-440c-bca3-15943a388a52-kube-api-access-xlrb7\") pod \"console-5766fb98dc-7hgzs\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:56:39.103382 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.103254 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b649836c-a791-440c-bca3-15943a388a52-oauth-serving-cert\") pod \"console-5766fb98dc-7hgzs\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:56:39.204041 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.203958 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlrb7\" (UniqueName: \"kubernetes.io/projected/b649836c-a791-440c-bca3-15943a388a52-kube-api-access-xlrb7\") pod \"console-5766fb98dc-7hgzs\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:56:39.204041 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.204014 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b649836c-a791-440c-bca3-15943a388a52-oauth-serving-cert\") pod \"console-5766fb98dc-7hgzs\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:56:39.204274 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.204051 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b649836c-a791-440c-bca3-15943a388a52-console-oauth-config\") pod \"console-5766fb98dc-7hgzs\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:56:39.204274 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.204107 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b649836c-a791-440c-bca3-15943a388a52-console-serving-cert\") pod \"console-5766fb98dc-7hgzs\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:56:39.204274 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.204126 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b649836c-a791-440c-bca3-15943a388a52-service-ca\") pod \"console-5766fb98dc-7hgzs\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:56:39.204274 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.204200 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b649836c-a791-440c-bca3-15943a388a52-console-config\") pod \"console-5766fb98dc-7hgzs\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:56:39.204871 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.204796 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b649836c-a791-440c-bca3-15943a388a52-oauth-serving-cert\") pod \"console-5766fb98dc-7hgzs\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:56:39.205000 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.204973 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b649836c-a791-440c-bca3-15943a388a52-service-ca\") pod \"console-5766fb98dc-7hgzs\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:56:39.205074 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.204974 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b649836c-a791-440c-bca3-15943a388a52-console-config\") pod \"console-5766fb98dc-7hgzs\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:56:39.206952 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.206928 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b649836c-a791-440c-bca3-15943a388a52-console-serving-cert\") pod \"console-5766fb98dc-7hgzs\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:56:39.207068 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.207008 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b649836c-a791-440c-bca3-15943a388a52-console-oauth-config\") pod \"console-5766fb98dc-7hgzs\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:56:39.213078 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.213058 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlrb7\" (UniqueName: \"kubernetes.io/projected/b649836c-a791-440c-bca3-15943a388a52-kube-api-access-xlrb7\") pod \"console-5766fb98dc-7hgzs\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:56:39.251024 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.250983 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:56:39.388147 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.388114 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5766fb98dc-7hgzs"]
Apr 22 17:56:39.393189 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:56:39.393150 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb649836c_a791_440c_bca3_15943a388a52.slice/crio-bb13809b9177e95b823b9e7827ade70f1e4afc541b64f1fc063e5b654a27be90 WatchSource:0}: Error finding container bb13809b9177e95b823b9e7827ade70f1e4afc541b64f1fc063e5b654a27be90: Status 404 returned error can't find the container with id bb13809b9177e95b823b9e7827ade70f1e4afc541b64f1fc063e5b654a27be90
Apr 22 17:56:39.507016 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:39.506974 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5766fb98dc-7hgzs" event={"ID":"b649836c-a791-440c-bca3-15943a388a52","Type":"ContainerStarted","Data":"bb13809b9177e95b823b9e7827ade70f1e4afc541b64f1fc063e5b654a27be90"}
Apr 22 17:56:40.040276 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:40.040233 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699"
Apr 22 17:56:40.040726 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:40.040289 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699"
Apr 22 17:56:42.518007 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:42.517924 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5766fb98dc-7hgzs" event={"ID":"b649836c-a791-440c-bca3-15943a388a52","Type":"ContainerStarted","Data":"369530bc3dc32c3a77272b3356f6da532339898bd5db742e03abb95b0ec8c2e1"}
Apr 22 17:56:42.539806 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:42.539761 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5766fb98dc-7hgzs" podStartSLOduration=1.677679342 podStartE2EDuration="4.539747041s" podCreationTimestamp="2026-04-22 17:56:38 +0000 UTC" firstStartedPulling="2026-04-22 17:56:39.395627441 +0000 UTC m=+214.083087826" lastFinishedPulling="2026-04-22 17:56:42.257695127 +0000 UTC m=+216.945155525" observedRunningTime="2026-04-22 17:56:42.537319387 +0000 UTC m=+217.224779816" watchObservedRunningTime="2026-04-22 17:56:42.539747041 +0000 UTC m=+217.227207448"
Apr 22 17:56:49.049153 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.049119 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6db495bb45-tcfg2"]
Apr 22 17:56:49.051776 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.051751 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6db495bb45-tcfg2"
Apr 22 17:56:49.060424 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.060177 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 22 17:56:49.062390 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.062368 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6db495bb45-tcfg2"]
Apr 22 17:56:49.090069 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.090045 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-console-serving-cert\") pod \"console-6db495bb45-tcfg2\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") " pod="openshift-console/console-6db495bb45-tcfg2"
Apr 22 17:56:49.090179 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.090097 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-console-config\") pod \"console-6db495bb45-tcfg2\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") " pod="openshift-console/console-6db495bb45-tcfg2"
Apr 22 17:56:49.090238 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.090172 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-service-ca\") pod \"console-6db495bb45-tcfg2\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") " pod="openshift-console/console-6db495bb45-tcfg2"
Apr 22 17:56:49.090297 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.090235 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-oauth-serving-cert\") pod \"console-6db495bb45-tcfg2\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") " pod="openshift-console/console-6db495bb45-tcfg2"
Apr 22 17:56:49.090297 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.090255 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z796b\" (UniqueName: \"kubernetes.io/projected/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-kube-api-access-z796b\") pod \"console-6db495bb45-tcfg2\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") " pod="openshift-console/console-6db495bb45-tcfg2"
Apr 22 17:56:49.090387 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.090323 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-trusted-ca-bundle\") pod \"console-6db495bb45-tcfg2\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") " pod="openshift-console/console-6db495bb45-tcfg2"
Apr 22 17:56:49.090387 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.090371 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-console-oauth-config\") pod \"console-6db495bb45-tcfg2\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") " pod="openshift-console/console-6db495bb45-tcfg2"
Apr 22 17:56:49.191393 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.191360 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-console-config\") pod \"console-6db495bb45-tcfg2\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") " pod="openshift-console/console-6db495bb45-tcfg2"
Apr 22 17:56:49.191526 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.191426 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-service-ca\") pod \"console-6db495bb45-tcfg2\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") " pod="openshift-console/console-6db495bb45-tcfg2"
Apr 22 17:56:49.191526 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.191461 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-oauth-serving-cert\") pod \"console-6db495bb45-tcfg2\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") " pod="openshift-console/console-6db495bb45-tcfg2"
Apr 22 17:56:49.191526 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.191478 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z796b\" (UniqueName: \"kubernetes.io/projected/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-kube-api-access-z796b\") pod \"console-6db495bb45-tcfg2\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") "
pod="openshift-console/console-6db495bb45-tcfg2" Apr 22 17:56:49.191526 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.191502 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-trusted-ca-bundle\") pod \"console-6db495bb45-tcfg2\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") " pod="openshift-console/console-6db495bb45-tcfg2" Apr 22 17:56:49.191741 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.191542 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-console-oauth-config\") pod \"console-6db495bb45-tcfg2\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") " pod="openshift-console/console-6db495bb45-tcfg2" Apr 22 17:56:49.191741 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.191599 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-console-serving-cert\") pod \"console-6db495bb45-tcfg2\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") " pod="openshift-console/console-6db495bb45-tcfg2" Apr 22 17:56:49.192258 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.192167 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-console-config\") pod \"console-6db495bb45-tcfg2\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") " pod="openshift-console/console-6db495bb45-tcfg2" Apr 22 17:56:49.192390 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.192370 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-service-ca\") pod \"console-6db495bb45-tcfg2\" (UID: 
\"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") " pod="openshift-console/console-6db495bb45-tcfg2" Apr 22 17:56:49.192514 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.192486 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-oauth-serving-cert\") pod \"console-6db495bb45-tcfg2\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") " pod="openshift-console/console-6db495bb45-tcfg2" Apr 22 17:56:49.192728 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.192684 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-trusted-ca-bundle\") pod \"console-6db495bb45-tcfg2\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") " pod="openshift-console/console-6db495bb45-tcfg2" Apr 22 17:56:49.194747 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.194727 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-console-serving-cert\") pod \"console-6db495bb45-tcfg2\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") " pod="openshift-console/console-6db495bb45-tcfg2" Apr 22 17:56:49.194858 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.194782 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-console-oauth-config\") pod \"console-6db495bb45-tcfg2\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") " pod="openshift-console/console-6db495bb45-tcfg2" Apr 22 17:56:49.199966 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.199944 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z796b\" (UniqueName: 
\"kubernetes.io/projected/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-kube-api-access-z796b\") pod \"console-6db495bb45-tcfg2\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") " pod="openshift-console/console-6db495bb45-tcfg2" Apr 22 17:56:49.251370 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.251345 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5766fb98dc-7hgzs" Apr 22 17:56:49.251473 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.251386 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5766fb98dc-7hgzs" Apr 22 17:56:49.256765 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.256743 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5766fb98dc-7hgzs" Apr 22 17:56:49.364101 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.364028 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6db495bb45-tcfg2" Apr 22 17:56:49.542619 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.542582 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5766fb98dc-7hgzs" Apr 22 17:56:49.717569 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:49.717547 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6db495bb45-tcfg2"] Apr 22 17:56:49.720614 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:56:49.720587 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddab444cf_91db_46c1_afb4_0d0e3e0cef8a.slice/crio-ecd819c7add12bcdd98a2607ed104e9d008a7a35447f3f2d2ab3674dd1c554cc WatchSource:0}: Error finding container ecd819c7add12bcdd98a2607ed104e9d008a7a35447f3f2d2ab3674dd1c554cc: Status 404 returned error can't find the container with id ecd819c7add12bcdd98a2607ed104e9d008a7a35447f3f2d2ab3674dd1c554cc Apr 22 
17:56:50.543053 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:50.543012 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-vsfb6" event={"ID":"2c1e09a8-f37d-488d-875f-501a78bdc7b1","Type":"ContainerStarted","Data":"2ed59f9ec376a542a68b070ffe974cb1d9c1ac8796b1d38f1cbe02c5c3b4e9ab"} Apr 22 17:56:50.543617 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:50.543214 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-vsfb6" Apr 22 17:56:50.544678 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:50.544649 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6db495bb45-tcfg2" event={"ID":"dab444cf-91db-46c1-afb4-0d0e3e0cef8a","Type":"ContainerStarted","Data":"6e9bf6fa65edca8f73a66a3c2ee443398ecd2e545f51cda51bfc236573a5a181"} Apr 22 17:56:50.544884 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:50.544857 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6db495bb45-tcfg2" event={"ID":"dab444cf-91db-46c1-afb4-0d0e3e0cef8a","Type":"ContainerStarted","Data":"ecd819c7add12bcdd98a2607ed104e9d008a7a35447f3f2d2ab3674dd1c554cc"} Apr 22 17:56:50.560527 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:50.560499 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-vsfb6" Apr 22 17:56:50.561953 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:50.561914 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-vsfb6" podStartSLOduration=1.695376577 podStartE2EDuration="17.561902679s" podCreationTimestamp="2026-04-22 17:56:33 +0000 UTC" firstStartedPulling="2026-04-22 17:56:33.794431764 +0000 UTC m=+208.481892149" lastFinishedPulling="2026-04-22 17:56:49.660957848 +0000 UTC m=+224.348418251" observedRunningTime="2026-04-22 17:56:50.560547885 +0000 UTC m=+225.248008292" 
watchObservedRunningTime="2026-04-22 17:56:50.561902679 +0000 UTC m=+225.249363099" Apr 22 17:56:50.594610 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:50.594560 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6db495bb45-tcfg2" podStartSLOduration=1.594544049 podStartE2EDuration="1.594544049s" podCreationTimestamp="2026-04-22 17:56:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:56:50.593431842 +0000 UTC m=+225.280892260" watchObservedRunningTime="2026-04-22 17:56:50.594544049 +0000 UTC m=+225.282004458" Apr 22 17:56:52.552283 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:52.552250 2574 generic.go:358] "Generic (PLEG): container finished" podID="3c91290f-1a67-4f2b-bb75-f6e0647e34d5" containerID="2f3ca601215b8da188281a1baf46c5eec6320f9f24b84a8e0131c5d37a9621bc" exitCode=0 Apr 22 17:56:52.552708 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:52.552329 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-82zvn" event={"ID":"3c91290f-1a67-4f2b-bb75-f6e0647e34d5","Type":"ContainerDied","Data":"2f3ca601215b8da188281a1baf46c5eec6320f9f24b84a8e0131c5d37a9621bc"} Apr 22 17:56:52.552824 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:52.552799 2574 scope.go:117] "RemoveContainer" containerID="2f3ca601215b8da188281a1baf46c5eec6320f9f24b84a8e0131c5d37a9621bc" Apr 22 17:56:53.559472 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:53.559429 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-82zvn" event={"ID":"3c91290f-1a67-4f2b-bb75-f6e0647e34d5","Type":"ContainerStarted","Data":"96a2a473a435b56aff430b8c18f0e83466cf3f4c41294c5f4d8818cf6b162775"} Apr 22 17:56:59.364683 ip-10-0-132-24 kubenswrapper[2574]: 
I0422 17:56:59.364651 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6db495bb45-tcfg2" Apr 22 17:56:59.365145 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:59.364744 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6db495bb45-tcfg2" Apr 22 17:56:59.369314 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:59.369295 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6db495bb45-tcfg2" Apr 22 17:56:59.581552 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:59.581525 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6db495bb45-tcfg2" Apr 22 17:56:59.652653 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:56:59.652573 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5766fb98dc-7hgzs"] Apr 22 17:57:00.045970 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:00.045940 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699" Apr 22 17:57:00.049743 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:00.049719 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7d7d964cb8-z9699" Apr 22 17:57:07.600120 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:07.600091 2574 generic.go:358] "Generic (PLEG): container finished" podID="b85d7429-1a02-4051-b96d-692a3bc3bacc" containerID="b70caa65f1494ab2ae19543328d4bf783f548006c4b0c4be2e335e842e30ec0e" exitCode=0 Apr 22 17:57:07.600497 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:07.600140 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x4jsj" 
event={"ID":"b85d7429-1a02-4051-b96d-692a3bc3bacc","Type":"ContainerDied","Data":"b70caa65f1494ab2ae19543328d4bf783f548006c4b0c4be2e335e842e30ec0e"} Apr 22 17:57:07.600497 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:07.600433 2574 scope.go:117] "RemoveContainer" containerID="b70caa65f1494ab2ae19543328d4bf783f548006c4b0c4be2e335e842e30ec0e" Apr 22 17:57:08.604792 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:08.604754 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x4jsj" event={"ID":"b85d7429-1a02-4051-b96d-692a3bc3bacc","Type":"ContainerStarted","Data":"05776372c82d752ee0e11a18e91c92df0841fdf9d55a213e9cca3eb40d306a0a"} Apr 22 17:57:17.640042 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:17.639958 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs\") pod \"network-metrics-daemon-fztfm\" (UID: \"027c3a56-b141-4f0e-beda-4bbc2fdc45c6\") " pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:57:17.642252 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:17.642228 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/027c3a56-b141-4f0e-beda-4bbc2fdc45c6-metrics-certs\") pod \"network-metrics-daemon-fztfm\" (UID: \"027c3a56-b141-4f0e-beda-4bbc2fdc45c6\") " pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:57:17.756561 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:17.756529 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qsx25\"" Apr 22 17:57:17.764142 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:17.764120 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fztfm" Apr 22 17:57:17.883056 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:17.883030 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fztfm"] Apr 22 17:57:17.885632 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:57:17.885609 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod027c3a56_b141_4f0e_beda_4bbc2fdc45c6.slice/crio-0b454c490db76e5d0e3d94c1e2c2d3d21fc3101148f2d9dce3f71b003d3035bb WatchSource:0}: Error finding container 0b454c490db76e5d0e3d94c1e2c2d3d21fc3101148f2d9dce3f71b003d3035bb: Status 404 returned error can't find the container with id 0b454c490db76e5d0e3d94c1e2c2d3d21fc3101148f2d9dce3f71b003d3035bb Apr 22 17:57:18.635098 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:18.635064 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fztfm" event={"ID":"027c3a56-b141-4f0e-beda-4bbc2fdc45c6","Type":"ContainerStarted","Data":"0b454c490db76e5d0e3d94c1e2c2d3d21fc3101148f2d9dce3f71b003d3035bb"} Apr 22 17:57:19.643821 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:19.643787 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fztfm" event={"ID":"027c3a56-b141-4f0e-beda-4bbc2fdc45c6","Type":"ContainerStarted","Data":"bf8b82bdac71d7400c640f57121818009980d409e80b8300d747915ff7390ed2"} Apr 22 17:57:20.648396 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:20.648360 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fztfm" event={"ID":"027c3a56-b141-4f0e-beda-4bbc2fdc45c6","Type":"ContainerStarted","Data":"cf5b6397972d482c83bb8e2701fbdbc744f1c34b88e12ddca42d8636e9ec050c"} Apr 22 17:57:20.665490 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:20.665433 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-fztfm" podStartSLOduration=254.229253847 podStartE2EDuration="4m15.66541618s" podCreationTimestamp="2026-04-22 17:53:05 +0000 UTC" firstStartedPulling="2026-04-22 17:57:17.887923871 +0000 UTC m=+252.575384256" lastFinishedPulling="2026-04-22 17:57:19.324086196 +0000 UTC m=+254.011546589" observedRunningTime="2026-04-22 17:57:20.664511326 +0000 UTC m=+255.351971735" watchObservedRunningTime="2026-04-22 17:57:20.66541618 +0000 UTC m=+255.352876588" Apr 22 17:57:24.672041 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:24.671996 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5766fb98dc-7hgzs" podUID="b649836c-a791-440c-bca3-15943a388a52" containerName="console" containerID="cri-o://369530bc3dc32c3a77272b3356f6da532339898bd5db742e03abb95b0ec8c2e1" gracePeriod=15 Apr 22 17:57:24.934727 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:24.934705 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5766fb98dc-7hgzs_b649836c-a791-440c-bca3-15943a388a52/console/0.log" Apr 22 17:57:24.934844 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:24.934763 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5766fb98dc-7hgzs" Apr 22 17:57:25.098645 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.098616 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b649836c-a791-440c-bca3-15943a388a52-oauth-serving-cert\") pod \"b649836c-a791-440c-bca3-15943a388a52\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " Apr 22 17:57:25.098645 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.098648 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b649836c-a791-440c-bca3-15943a388a52-console-serving-cert\") pod \"b649836c-a791-440c-bca3-15943a388a52\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " Apr 22 17:57:25.098883 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.098668 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b649836c-a791-440c-bca3-15943a388a52-console-config\") pod \"b649836c-a791-440c-bca3-15943a388a52\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " Apr 22 17:57:25.098883 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.098706 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b649836c-a791-440c-bca3-15943a388a52-service-ca\") pod \"b649836c-a791-440c-bca3-15943a388a52\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " Apr 22 17:57:25.098883 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.098730 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b649836c-a791-440c-bca3-15943a388a52-console-oauth-config\") pod \"b649836c-a791-440c-bca3-15943a388a52\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " Apr 22 17:57:25.098883 ip-10-0-132-24 
kubenswrapper[2574]: I0422 17:57:25.098792 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlrb7\" (UniqueName: \"kubernetes.io/projected/b649836c-a791-440c-bca3-15943a388a52-kube-api-access-xlrb7\") pod \"b649836c-a791-440c-bca3-15943a388a52\" (UID: \"b649836c-a791-440c-bca3-15943a388a52\") " Apr 22 17:57:25.099154 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.099124 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b649836c-a791-440c-bca3-15943a388a52-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b649836c-a791-440c-bca3-15943a388a52" (UID: "b649836c-a791-440c-bca3-15943a388a52"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:57:25.099154 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.099142 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b649836c-a791-440c-bca3-15943a388a52-console-config" (OuterVolumeSpecName: "console-config") pod "b649836c-a791-440c-bca3-15943a388a52" (UID: "b649836c-a791-440c-bca3-15943a388a52"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:57:25.099289 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.099168 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b649836c-a791-440c-bca3-15943a388a52-service-ca" (OuterVolumeSpecName: "service-ca") pod "b649836c-a791-440c-bca3-15943a388a52" (UID: "b649836c-a791-440c-bca3-15943a388a52"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:57:25.101084 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.101059 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b649836c-a791-440c-bca3-15943a388a52-kube-api-access-xlrb7" (OuterVolumeSpecName: "kube-api-access-xlrb7") pod "b649836c-a791-440c-bca3-15943a388a52" (UID: "b649836c-a791-440c-bca3-15943a388a52"). InnerVolumeSpecName "kube-api-access-xlrb7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:57:25.101291 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.101267 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b649836c-a791-440c-bca3-15943a388a52-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b649836c-a791-440c-bca3-15943a388a52" (UID: "b649836c-a791-440c-bca3-15943a388a52"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:25.101370 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.101286 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b649836c-a791-440c-bca3-15943a388a52-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b649836c-a791-440c-bca3-15943a388a52" (UID: "b649836c-a791-440c-bca3-15943a388a52"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:25.200048 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.199994 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xlrb7\" (UniqueName: \"kubernetes.io/projected/b649836c-a791-440c-bca3-15943a388a52-kube-api-access-xlrb7\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\"" Apr 22 17:57:25.200048 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.200017 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b649836c-a791-440c-bca3-15943a388a52-oauth-serving-cert\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\"" Apr 22 17:57:25.200048 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.200027 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b649836c-a791-440c-bca3-15943a388a52-console-serving-cert\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\"" Apr 22 17:57:25.200048 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.200036 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b649836c-a791-440c-bca3-15943a388a52-console-config\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\"" Apr 22 17:57:25.200048 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.200045 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b649836c-a791-440c-bca3-15943a388a52-service-ca\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\"" Apr 22 17:57:25.200248 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.200059 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b649836c-a791-440c-bca3-15943a388a52-console-oauth-config\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\"" Apr 22 17:57:25.667319 ip-10-0-132-24 
kubenswrapper[2574]: I0422 17:57:25.667290 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5766fb98dc-7hgzs_b649836c-a791-440c-bca3-15943a388a52/console/0.log" Apr 22 17:57:25.667487 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.667332 2574 generic.go:358] "Generic (PLEG): container finished" podID="b649836c-a791-440c-bca3-15943a388a52" containerID="369530bc3dc32c3a77272b3356f6da532339898bd5db742e03abb95b0ec8c2e1" exitCode=2 Apr 22 17:57:25.667487 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.667390 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5766fb98dc-7hgzs" event={"ID":"b649836c-a791-440c-bca3-15943a388a52","Type":"ContainerDied","Data":"369530bc3dc32c3a77272b3356f6da532339898bd5db742e03abb95b0ec8c2e1"} Apr 22 17:57:25.667487 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.667429 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5766fb98dc-7hgzs" event={"ID":"b649836c-a791-440c-bca3-15943a388a52","Type":"ContainerDied","Data":"bb13809b9177e95b823b9e7827ade70f1e4afc541b64f1fc063e5b654a27be90"} Apr 22 17:57:25.667487 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.667426 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5766fb98dc-7hgzs"
Apr 22 17:57:25.667487 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.667445 2574 scope.go:117] "RemoveContainer" containerID="369530bc3dc32c3a77272b3356f6da532339898bd5db742e03abb95b0ec8c2e1"
Apr 22 17:57:25.676162 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.675941 2574 scope.go:117] "RemoveContainer" containerID="369530bc3dc32c3a77272b3356f6da532339898bd5db742e03abb95b0ec8c2e1"
Apr 22 17:57:25.676392 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:57:25.676311 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"369530bc3dc32c3a77272b3356f6da532339898bd5db742e03abb95b0ec8c2e1\": container with ID starting with 369530bc3dc32c3a77272b3356f6da532339898bd5db742e03abb95b0ec8c2e1 not found: ID does not exist" containerID="369530bc3dc32c3a77272b3356f6da532339898bd5db742e03abb95b0ec8c2e1"
Apr 22 17:57:25.676392 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.676336 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"369530bc3dc32c3a77272b3356f6da532339898bd5db742e03abb95b0ec8c2e1"} err="failed to get container status \"369530bc3dc32c3a77272b3356f6da532339898bd5db742e03abb95b0ec8c2e1\": rpc error: code = NotFound desc = could not find container \"369530bc3dc32c3a77272b3356f6da532339898bd5db742e03abb95b0ec8c2e1\": container with ID starting with 369530bc3dc32c3a77272b3356f6da532339898bd5db742e03abb95b0ec8c2e1 not found: ID does not exist"
Apr 22 17:57:25.689088 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.689068 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5766fb98dc-7hgzs"]
Apr 22 17:57:25.692468 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.692447 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5766fb98dc-7hgzs"]
Apr 22 17:57:25.857684 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:25.857655 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b649836c-a791-440c-bca3-15943a388a52" path="/var/lib/kubelet/pods/b649836c-a791-440c-bca3-15943a388a52/volumes"
Apr 22 17:57:33.455638 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.455601 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6ccd7d4dc4-5qt5l"]
Apr 22 17:57:33.456083 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.455921 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b649836c-a791-440c-bca3-15943a388a52" containerName="console"
Apr 22 17:57:33.456083 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.455934 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b649836c-a791-440c-bca3-15943a388a52" containerName="console"
Apr 22 17:57:33.456083 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.455981 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b649836c-a791-440c-bca3-15943a388a52" containerName="console"
Apr 22 17:57:33.481859 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.481816 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.488058 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.488033 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ccd7d4dc4-5qt5l"]
Apr 22 17:57:33.560043 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.560016 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-trusted-ca-bundle\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.560165 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.560047 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0fd84a93-9b95-48c2-86c0-720486cf66cd-console-oauth-config\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.560165 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.560068 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9czng\" (UniqueName: \"kubernetes.io/projected/0fd84a93-9b95-48c2-86c0-720486cf66cd-kube-api-access-9czng\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.560165 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.560120 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-console-config\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.560263 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.560168 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd84a93-9b95-48c2-86c0-720486cf66cd-console-serving-cert\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.560263 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.560206 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-service-ca\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.560263 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.560223 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-oauth-serving-cert\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.660667 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.660641 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-trusted-ca-bundle\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.660769 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.660686 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0fd84a93-9b95-48c2-86c0-720486cf66cd-console-oauth-config\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.660769 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.660710 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9czng\" (UniqueName: \"kubernetes.io/projected/0fd84a93-9b95-48c2-86c0-720486cf66cd-kube-api-access-9czng\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.660769 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.660740 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-console-config\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.660952 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.660780 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd84a93-9b95-48c2-86c0-720486cf66cd-console-serving-cert\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.660952 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.660807 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-service-ca\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.660952 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.660852 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-oauth-serving-cert\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.661581 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.661556 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-console-config\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.661581 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.661569 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-service-ca\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.661736 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.661662 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-oauth-serving-cert\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.661919 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.661896 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-trusted-ca-bundle\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.663189 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.663171 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd84a93-9b95-48c2-86c0-720486cf66cd-console-serving-cert\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.663278 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.663208 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0fd84a93-9b95-48c2-86c0-720486cf66cd-console-oauth-config\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.670915 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.670896 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9czng\" (UniqueName: \"kubernetes.io/projected/0fd84a93-9b95-48c2-86c0-720486cf66cd-kube-api-access-9czng\") pod \"console-6ccd7d4dc4-5qt5l\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.792987 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.792966 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:33.933561 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:33.933533 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ccd7d4dc4-5qt5l"]
Apr 22 17:57:33.936259 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:57:33.936236 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fd84a93_9b95_48c2_86c0_720486cf66cd.slice/crio-d15fee5e0015b0988fc7e63e87e19975da90ec5cd799b1b7c60172906952b8a9 WatchSource:0}: Error finding container d15fee5e0015b0988fc7e63e87e19975da90ec5cd799b1b7c60172906952b8a9: Status 404 returned error can't find the container with id d15fee5e0015b0988fc7e63e87e19975da90ec5cd799b1b7c60172906952b8a9
Apr 22 17:57:34.704610 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:34.704575 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ccd7d4dc4-5qt5l" event={"ID":"0fd84a93-9b95-48c2-86c0-720486cf66cd","Type":"ContainerStarted","Data":"72b07df33574d2959329a3504620a273b1ee850055d545aeeafe9f0180a2f12a"}
Apr 22 17:57:34.704610 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:34.704610 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ccd7d4dc4-5qt5l" event={"ID":"0fd84a93-9b95-48c2-86c0-720486cf66cd","Type":"ContainerStarted","Data":"d15fee5e0015b0988fc7e63e87e19975da90ec5cd799b1b7c60172906952b8a9"}
Apr 22 17:57:34.727035 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:34.726995 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6ccd7d4dc4-5qt5l" podStartSLOduration=1.726983246 podStartE2EDuration="1.726983246s" podCreationTimestamp="2026-04-22 17:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:57:34.725422559 +0000 UTC m=+269.412882980" watchObservedRunningTime="2026-04-22 17:57:34.726983246 +0000 UTC m=+269.414443653"
Apr 22 17:57:43.793874 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:43.793822 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:43.793874 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:43.793878 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:43.802623 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:43.802598 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:44.736853 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:44.736811 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6ccd7d4dc4-5qt5l"
Apr 22 17:57:44.784926 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:44.784895 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6db495bb45-tcfg2"]
Apr 22 17:57:47.164914 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:47.164882 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert\") pod \"ingress-canary-85lcr\" (UID: \"0a435446-c735-4a7b-bbb9-eab6af3f7b77\") " pod="openshift-ingress-canary/ingress-canary-85lcr"
Apr 22 17:57:47.167302 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:47.167270 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a435446-c735-4a7b-bbb9-eab6af3f7b77-cert\") pod \"ingress-canary-85lcr\" (UID: \"0a435446-c735-4a7b-bbb9-eab6af3f7b77\") " pod="openshift-ingress-canary/ingress-canary-85lcr"
Apr 22 17:57:47.455384 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:47.455303 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-frn4n\""
Apr 22 17:57:47.462738 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:47.462720 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-85lcr"
Apr 22 17:57:47.580421 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:47.580391 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-85lcr"]
Apr 22 17:57:47.583539 ip-10-0-132-24 kubenswrapper[2574]: W0422 17:57:47.583512 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a435446_c735_4a7b_bbb9_eab6af3f7b77.slice/crio-331da878bcbf1b6c1ec170063d95dae213d65d369b933055be953512d40b817d WatchSource:0}: Error finding container 331da878bcbf1b6c1ec170063d95dae213d65d369b933055be953512d40b817d: Status 404 returned error can't find the container with id 331da878bcbf1b6c1ec170063d95dae213d65d369b933055be953512d40b817d
Apr 22 17:57:47.741826 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:47.741790 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-85lcr" event={"ID":"0a435446-c735-4a7b-bbb9-eab6af3f7b77","Type":"ContainerStarted","Data":"331da878bcbf1b6c1ec170063d95dae213d65d369b933055be953512d40b817d"}
Apr 22 17:57:49.749253 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:49.749212 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-85lcr" event={"ID":"0a435446-c735-4a7b-bbb9-eab6af3f7b77","Type":"ContainerStarted","Data":"7931081c7c9d1714ab317cfce10b589485368849bbe1d9184cd2bd215081dfc3"}
Apr 22 17:57:49.765239 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:57:49.765195 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-85lcr" podStartSLOduration=251.252110633 podStartE2EDuration="4m12.765185411s" podCreationTimestamp="2026-04-22 17:53:37 +0000 UTC" firstStartedPulling="2026-04-22 17:57:47.585362439 +0000 UTC m=+282.272822825" lastFinishedPulling="2026-04-22 17:57:49.098437204 +0000 UTC m=+283.785897603" observedRunningTime="2026-04-22 17:57:49.763820406 +0000 UTC m=+284.451280813" watchObservedRunningTime="2026-04-22 17:57:49.765185411 +0000 UTC m=+284.452645818"
Apr 22 17:58:05.758405 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:05.758377 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log"
Apr 22 17:58:05.760069 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:05.760045 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log"
Apr 22 17:58:05.764667 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:05.764645 2574 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 17:58:09.805768 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:09.805733 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6db495bb45-tcfg2" podUID="dab444cf-91db-46c1-afb4-0d0e3e0cef8a" containerName="console" containerID="cri-o://6e9bf6fa65edca8f73a66a3c2ee443398ecd2e545f51cda51bfc236573a5a181" gracePeriod=15
Apr 22 17:58:10.053714 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.053693 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6db495bb45-tcfg2_dab444cf-91db-46c1-afb4-0d0e3e0cef8a/console/0.log"
Apr 22 17:58:10.053856 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.053750 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6db495bb45-tcfg2"
Apr 22 17:58:10.142449 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.142365 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-service-ca\") pod \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") "
Apr 22 17:58:10.142449 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.142405 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-console-serving-cert\") pod \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") "
Apr 22 17:58:10.142659 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.142452 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-console-config\") pod \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") "
Apr 22 17:58:10.142659 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.142468 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-trusted-ca-bundle\") pod \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") "
Apr 22 17:58:10.142659 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.142489 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z796b\" (UniqueName: \"kubernetes.io/projected/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-kube-api-access-z796b\") pod \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") "
Apr 22 17:58:10.142659 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.142506 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-console-oauth-config\") pod \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") "
Apr 22 17:58:10.142659 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.142536 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-oauth-serving-cert\") pod \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\" (UID: \"dab444cf-91db-46c1-afb4-0d0e3e0cef8a\") "
Apr 22 17:58:10.142927 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.142888 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-console-config" (OuterVolumeSpecName: "console-config") pod "dab444cf-91db-46c1-afb4-0d0e3e0cef8a" (UID: "dab444cf-91db-46c1-afb4-0d0e3e0cef8a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:58:10.142927 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.142786 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-service-ca" (OuterVolumeSpecName: "service-ca") pod "dab444cf-91db-46c1-afb4-0d0e3e0cef8a" (UID: "dab444cf-91db-46c1-afb4-0d0e3e0cef8a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:58:10.143064 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.143029 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dab444cf-91db-46c1-afb4-0d0e3e0cef8a" (UID: "dab444cf-91db-46c1-afb4-0d0e3e0cef8a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:58:10.143169 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.143064 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "dab444cf-91db-46c1-afb4-0d0e3e0cef8a" (UID: "dab444cf-91db-46c1-afb4-0d0e3e0cef8a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:58:10.144855 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.144812 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "dab444cf-91db-46c1-afb4-0d0e3e0cef8a" (UID: "dab444cf-91db-46c1-afb4-0d0e3e0cef8a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:58:10.144955 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.144861 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-kube-api-access-z796b" (OuterVolumeSpecName: "kube-api-access-z796b") pod "dab444cf-91db-46c1-afb4-0d0e3e0cef8a" (UID: "dab444cf-91db-46c1-afb4-0d0e3e0cef8a"). InnerVolumeSpecName "kube-api-access-z796b". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:58:10.144955 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.144868 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "dab444cf-91db-46c1-afb4-0d0e3e0cef8a" (UID: "dab444cf-91db-46c1-afb4-0d0e3e0cef8a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:58:10.243614 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.243593 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-console-config\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\""
Apr 22 17:58:10.243614 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.243614 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-trusted-ca-bundle\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\""
Apr 22 17:58:10.243732 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.243624 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z796b\" (UniqueName: \"kubernetes.io/projected/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-kube-api-access-z796b\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\""
Apr 22 17:58:10.243732 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.243634 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-console-oauth-config\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\""
Apr 22 17:58:10.243732 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.243643 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-oauth-serving-cert\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\""
Apr 22 17:58:10.243732 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.243652 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-service-ca\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\""
Apr 22 17:58:10.243732 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.243661 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dab444cf-91db-46c1-afb4-0d0e3e0cef8a-console-serving-cert\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\""
Apr 22 17:58:10.815019 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.814992 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6db495bb45-tcfg2_dab444cf-91db-46c1-afb4-0d0e3e0cef8a/console/0.log"
Apr 22 17:58:10.815378 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.815031 2574 generic.go:358] "Generic (PLEG): container finished" podID="dab444cf-91db-46c1-afb4-0d0e3e0cef8a" containerID="6e9bf6fa65edca8f73a66a3c2ee443398ecd2e545f51cda51bfc236573a5a181" exitCode=2
Apr 22 17:58:10.815378 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.815098 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6db495bb45-tcfg2"
Apr 22 17:58:10.815378 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.815110 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6db495bb45-tcfg2" event={"ID":"dab444cf-91db-46c1-afb4-0d0e3e0cef8a","Type":"ContainerDied","Data":"6e9bf6fa65edca8f73a66a3c2ee443398ecd2e545f51cda51bfc236573a5a181"}
Apr 22 17:58:10.815378 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.815142 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6db495bb45-tcfg2" event={"ID":"dab444cf-91db-46c1-afb4-0d0e3e0cef8a","Type":"ContainerDied","Data":"ecd819c7add12bcdd98a2607ed104e9d008a7a35447f3f2d2ab3674dd1c554cc"}
Apr 22 17:58:10.815378 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.815158 2574 scope.go:117] "RemoveContainer" containerID="6e9bf6fa65edca8f73a66a3c2ee443398ecd2e545f51cda51bfc236573a5a181"
Apr 22 17:58:10.823380 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.823364 2574 scope.go:117] "RemoveContainer" containerID="6e9bf6fa65edca8f73a66a3c2ee443398ecd2e545f51cda51bfc236573a5a181"
Apr 22 17:58:10.823641 ip-10-0-132-24 kubenswrapper[2574]: E0422 17:58:10.823622 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e9bf6fa65edca8f73a66a3c2ee443398ecd2e545f51cda51bfc236573a5a181\": container with ID starting with 6e9bf6fa65edca8f73a66a3c2ee443398ecd2e545f51cda51bfc236573a5a181 not found: ID does not exist" containerID="6e9bf6fa65edca8f73a66a3c2ee443398ecd2e545f51cda51bfc236573a5a181"
Apr 22 17:58:10.823690 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.823653 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e9bf6fa65edca8f73a66a3c2ee443398ecd2e545f51cda51bfc236573a5a181"} err="failed to get container status \"6e9bf6fa65edca8f73a66a3c2ee443398ecd2e545f51cda51bfc236573a5a181\": rpc error: code = NotFound desc = could not find container \"6e9bf6fa65edca8f73a66a3c2ee443398ecd2e545f51cda51bfc236573a5a181\": container with ID starting with 6e9bf6fa65edca8f73a66a3c2ee443398ecd2e545f51cda51bfc236573a5a181 not found: ID does not exist"
Apr 22 17:58:10.834916 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.834894 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6db495bb45-tcfg2"]
Apr 22 17:58:10.838450 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:10.838430 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6db495bb45-tcfg2"]
Apr 22 17:58:11.855175 ip-10-0-132-24 kubenswrapper[2574]: I0422 17:58:11.855140 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab444cf-91db-46c1-afb4-0d0e3e0cef8a" path="/var/lib/kubelet/pods/dab444cf-91db-46c1-afb4-0d0e3e0cef8a/volumes"
Apr 22 18:00:39.143385 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.143300 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9"]
Apr 22 18:00:39.143829 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.143593 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dab444cf-91db-46c1-afb4-0d0e3e0cef8a" containerName="console"
Apr 22 18:00:39.143829 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.143607 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab444cf-91db-46c1-afb4-0d0e3e0cef8a" containerName="console"
Apr 22 18:00:39.143829 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.143663 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="dab444cf-91db-46c1-afb4-0d0e3e0cef8a" containerName="console"
Apr 22 18:00:39.146629 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.146613 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9"
Apr 22 18:00:39.149355 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.149334 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 18:00:39.150519 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.150502 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 18:00:39.150611 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.150505 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-2n5pl\""
Apr 22 18:00:39.165615 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.165597 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9"]
Apr 22 18:00:39.208313 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.208290 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b71317d-6f81-4fc1-be7b-4dd29e929bb1-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9\" (UID: \"0b71317d-6f81-4fc1-be7b-4dd29e929bb1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9"
Apr 22 18:00:39.208418 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.208337 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b71317d-6f81-4fc1-be7b-4dd29e929bb1-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9\" (UID: \"0b71317d-6f81-4fc1-be7b-4dd29e929bb1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9"
Apr 22 18:00:39.208418 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.208358 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkgg4\" (UniqueName: \"kubernetes.io/projected/0b71317d-6f81-4fc1-be7b-4dd29e929bb1-kube-api-access-rkgg4\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9\" (UID: \"0b71317d-6f81-4fc1-be7b-4dd29e929bb1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9"
Apr 22 18:00:39.309468 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.309423 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b71317d-6f81-4fc1-be7b-4dd29e929bb1-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9\" (UID: \"0b71317d-6f81-4fc1-be7b-4dd29e929bb1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9"
Apr 22 18:00:39.309659 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.309512 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b71317d-6f81-4fc1-be7b-4dd29e929bb1-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9\" (UID: \"0b71317d-6f81-4fc1-be7b-4dd29e929bb1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9"
Apr 22 18:00:39.309659 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.309544 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkgg4\" (UniqueName: \"kubernetes.io/projected/0b71317d-6f81-4fc1-be7b-4dd29e929bb1-kube-api-access-rkgg4\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9\" (UID: \"0b71317d-6f81-4fc1-be7b-4dd29e929bb1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9"
Apr 22 18:00:39.309823 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.309801 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b71317d-6f81-4fc1-be7b-4dd29e929bb1-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9\" (UID: \"0b71317d-6f81-4fc1-be7b-4dd29e929bb1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9"
Apr 22 18:00:39.309917 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.309819 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b71317d-6f81-4fc1-be7b-4dd29e929bb1-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9\" (UID: \"0b71317d-6f81-4fc1-be7b-4dd29e929bb1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9"
Apr 22 18:00:39.319202 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.319177 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkgg4\" (UniqueName: \"kubernetes.io/projected/0b71317d-6f81-4fc1-be7b-4dd29e929bb1-kube-api-access-rkgg4\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9\" (UID: \"0b71317d-6f81-4fc1-be7b-4dd29e929bb1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9"
Apr 22 18:00:39.455524 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.455437 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9"
Apr 22 18:00:39.574955 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.574869 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9"]
Apr 22 18:00:39.577670 ip-10-0-132-24 kubenswrapper[2574]: W0422 18:00:39.577642 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b71317d_6f81_4fc1_be7b_4dd29e929bb1.slice/crio-3243a0f827c6b0eb7ac94de482fb364855677e51050b4152e2483b704afbb74e WatchSource:0}: Error finding container 3243a0f827c6b0eb7ac94de482fb364855677e51050b4152e2483b704afbb74e: Status 404 returned error can't find the container with id 3243a0f827c6b0eb7ac94de482fb364855677e51050b4152e2483b704afbb74e
Apr 22 18:00:39.579531 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:39.579515 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:00:40.224961 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:40.224911 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9" event={"ID":"0b71317d-6f81-4fc1-be7b-4dd29e929bb1","Type":"ContainerStarted","Data":"3243a0f827c6b0eb7ac94de482fb364855677e51050b4152e2483b704afbb74e"}
Apr 22 18:00:45.240946 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:45.240906 2574 generic.go:358] "Generic (PLEG): container finished" podID="0b71317d-6f81-4fc1-be7b-4dd29e929bb1" containerID="4ec7d0c9721d33a949e27b9bd210bace071abb4fd17a750016ee783706547e88" exitCode=0
Apr 22 18:00:45.241303 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:45.240976 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9"
event={"ID":"0b71317d-6f81-4fc1-be7b-4dd29e929bb1","Type":"ContainerDied","Data":"4ec7d0c9721d33a949e27b9bd210bace071abb4fd17a750016ee783706547e88"} Apr 22 18:00:48.252055 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:48.252017 2574 generic.go:358] "Generic (PLEG): container finished" podID="0b71317d-6f81-4fc1-be7b-4dd29e929bb1" containerID="e109577595e62289de7ece43be8bddd960d47f8dcba2b54dff50a44db0474ef5" exitCode=0 Apr 22 18:00:48.252402 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:48.252104 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9" event={"ID":"0b71317d-6f81-4fc1-be7b-4dd29e929bb1","Type":"ContainerDied","Data":"e109577595e62289de7ece43be8bddd960d47f8dcba2b54dff50a44db0474ef5"} Apr 22 18:00:55.277396 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:55.277357 2574 generic.go:358] "Generic (PLEG): container finished" podID="0b71317d-6f81-4fc1-be7b-4dd29e929bb1" containerID="32d6de3a22140673f4c7dbf456da2a90ad06a9f32395e210154a27a2f0a29b26" exitCode=0 Apr 22 18:00:55.277754 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:55.277401 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9" event={"ID":"0b71317d-6f81-4fc1-be7b-4dd29e929bb1","Type":"ContainerDied","Data":"32d6de3a22140673f4c7dbf456da2a90ad06a9f32395e210154a27a2f0a29b26"} Apr 22 18:00:56.397239 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:56.397217 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9" Apr 22 18:00:56.462795 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:56.462765 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkgg4\" (UniqueName: \"kubernetes.io/projected/0b71317d-6f81-4fc1-be7b-4dd29e929bb1-kube-api-access-rkgg4\") pod \"0b71317d-6f81-4fc1-be7b-4dd29e929bb1\" (UID: \"0b71317d-6f81-4fc1-be7b-4dd29e929bb1\") " Apr 22 18:00:56.462795 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:56.462802 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b71317d-6f81-4fc1-be7b-4dd29e929bb1-bundle\") pod \"0b71317d-6f81-4fc1-be7b-4dd29e929bb1\" (UID: \"0b71317d-6f81-4fc1-be7b-4dd29e929bb1\") " Apr 22 18:00:56.463061 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:56.462894 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b71317d-6f81-4fc1-be7b-4dd29e929bb1-util\") pod \"0b71317d-6f81-4fc1-be7b-4dd29e929bb1\" (UID: \"0b71317d-6f81-4fc1-be7b-4dd29e929bb1\") " Apr 22 18:00:56.463368 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:56.463344 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b71317d-6f81-4fc1-be7b-4dd29e929bb1-bundle" (OuterVolumeSpecName: "bundle") pod "0b71317d-6f81-4fc1-be7b-4dd29e929bb1" (UID: "0b71317d-6f81-4fc1-be7b-4dd29e929bb1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:00:56.465010 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:56.464978 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b71317d-6f81-4fc1-be7b-4dd29e929bb1-kube-api-access-rkgg4" (OuterVolumeSpecName: "kube-api-access-rkgg4") pod "0b71317d-6f81-4fc1-be7b-4dd29e929bb1" (UID: "0b71317d-6f81-4fc1-be7b-4dd29e929bb1"). InnerVolumeSpecName "kube-api-access-rkgg4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:00:56.467620 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:56.467598 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b71317d-6f81-4fc1-be7b-4dd29e929bb1-util" (OuterVolumeSpecName: "util") pod "0b71317d-6f81-4fc1-be7b-4dd29e929bb1" (UID: "0b71317d-6f81-4fc1-be7b-4dd29e929bb1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:00:56.564084 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:56.564004 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b71317d-6f81-4fc1-be7b-4dd29e929bb1-util\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\"" Apr 22 18:00:56.564084 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:56.564032 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rkgg4\" (UniqueName: \"kubernetes.io/projected/0b71317d-6f81-4fc1-be7b-4dd29e929bb1-kube-api-access-rkgg4\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\"" Apr 22 18:00:56.564084 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:56.564044 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b71317d-6f81-4fc1-be7b-4dd29e929bb1-bundle\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\"" Apr 22 18:00:57.283674 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:57.283633 2574 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9" event={"ID":"0b71317d-6f81-4fc1-be7b-4dd29e929bb1","Type":"ContainerDied","Data":"3243a0f827c6b0eb7ac94de482fb364855677e51050b4152e2483b704afbb74e"} Apr 22 18:00:57.283674 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:57.283666 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3243a0f827c6b0eb7ac94de482fb364855677e51050b4152e2483b704afbb74e" Apr 22 18:00:57.283674 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:00:57.283672 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfs6l9" Apr 22 18:01:00.976590 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:00.976543 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb"] Apr 22 18:01:00.977003 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:00.976868 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b71317d-6f81-4fc1-be7b-4dd29e929bb1" containerName="util" Apr 22 18:01:00.977003 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:00.976879 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b71317d-6f81-4fc1-be7b-4dd29e929bb1" containerName="util" Apr 22 18:01:00.977003 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:00.976894 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b71317d-6f81-4fc1-be7b-4dd29e929bb1" containerName="extract" Apr 22 18:01:00.977003 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:00.976899 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b71317d-6f81-4fc1-be7b-4dd29e929bb1" containerName="extract" Apr 22 18:01:00.977003 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:00.976908 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="0b71317d-6f81-4fc1-be7b-4dd29e929bb1" containerName="pull" Apr 22 18:01:00.977003 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:00.976914 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b71317d-6f81-4fc1-be7b-4dd29e929bb1" containerName="pull" Apr 22 18:01:00.977003 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:00.976956 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b71317d-6f81-4fc1-be7b-4dd29e929bb1" containerName="extract" Apr 22 18:01:01.031455 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:01.031425 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb"] Apr 22 18:01:01.031585 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:01.031542 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb" Apr 22 18:01:01.034322 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:01.034300 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-dsdtd\"" Apr 22 18:01:01.034480 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:01.034467 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 18:01:01.034590 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:01.034570 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 18:01:01.034644 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:01.034613 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 18:01:01.101765 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:01.101733 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/6c698dd5-7bbb-46fb-9009-f3696a34effb-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb\" (UID: \"6c698dd5-7bbb-46fb-9009-f3696a34effb\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb" Apr 22 18:01:01.101923 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:01.101770 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlpnq\" (UniqueName: \"kubernetes.io/projected/6c698dd5-7bbb-46fb-9009-f3696a34effb-kube-api-access-nlpnq\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb\" (UID: \"6c698dd5-7bbb-46fb-9009-f3696a34effb\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb" Apr 22 18:01:01.202852 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:01.202789 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nlpnq\" (UniqueName: \"kubernetes.io/projected/6c698dd5-7bbb-46fb-9009-f3696a34effb-kube-api-access-nlpnq\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb\" (UID: \"6c698dd5-7bbb-46fb-9009-f3696a34effb\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb" Apr 22 18:01:01.203041 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:01.202954 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/6c698dd5-7bbb-46fb-9009-f3696a34effb-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb\" (UID: \"6c698dd5-7bbb-46fb-9009-f3696a34effb\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb" Apr 22 18:01:01.205395 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:01.205371 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/6c698dd5-7bbb-46fb-9009-f3696a34effb-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb\" (UID: 
\"6c698dd5-7bbb-46fb-9009-f3696a34effb\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb" Apr 22 18:01:01.219598 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:01.219568 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlpnq\" (UniqueName: \"kubernetes.io/projected/6c698dd5-7bbb-46fb-9009-f3696a34effb-kube-api-access-nlpnq\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb\" (UID: \"6c698dd5-7bbb-46fb-9009-f3696a34effb\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb" Apr 22 18:01:01.342321 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:01.342245 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb" Apr 22 18:01:01.470293 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:01.470268 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb"] Apr 22 18:01:01.473285 ip-10-0-132-24 kubenswrapper[2574]: W0422 18:01:01.473257 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c698dd5_7bbb_46fb_9009_f3696a34effb.slice/crio-76ce8903b52e952524514eb75de2b99c0e37b6ca1e3e40d02ed7cda65161aa8c WatchSource:0}: Error finding container 76ce8903b52e952524514eb75de2b99c0e37b6ca1e3e40d02ed7cda65161aa8c: Status 404 returned error can't find the container with id 76ce8903b52e952524514eb75de2b99c0e37b6ca1e3e40d02ed7cda65161aa8c Apr 22 18:01:02.298634 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:02.298603 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb" event={"ID":"6c698dd5-7bbb-46fb-9009-f3696a34effb","Type":"ContainerStarted","Data":"76ce8903b52e952524514eb75de2b99c0e37b6ca1e3e40d02ed7cda65161aa8c"} Apr 22 18:01:06.314030 ip-10-0-132-24 kubenswrapper[2574]: I0422 
18:01:06.313999 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb" event={"ID":"6c698dd5-7bbb-46fb-9009-f3696a34effb","Type":"ContainerStarted","Data":"fcaa0cc9ffe4ce9402285631c786036573e073bee1f670385b6d5903c004718a"} Apr 22 18:01:06.314408 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.314077 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb" Apr 22 18:01:06.372214 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.372159 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb" podStartSLOduration=2.041200501 podStartE2EDuration="6.372125528s" podCreationTimestamp="2026-04-22 18:01:00 +0000 UTC" firstStartedPulling="2026-04-22 18:01:01.475616549 +0000 UTC m=+476.163076940" lastFinishedPulling="2026-04-22 18:01:05.806541582 +0000 UTC m=+480.494001967" observedRunningTime="2026-04-22 18:01:06.371046852 +0000 UTC m=+481.058507262" watchObservedRunningTime="2026-04-22 18:01:06.372125528 +0000 UTC m=+481.059585935" Apr 22 18:01:06.417317 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.417289 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-4cb8t"] Apr 22 18:01:06.437011 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.436985 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-4cb8t"] Apr 22 18:01:06.437146 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.437087 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-4cb8t" Apr 22 18:01:06.439900 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.439871 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-bvq9v\"" Apr 22 18:01:06.440017 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.439881 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 22 18:01:06.440423 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.440407 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 22 18:01:06.548903 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.548864 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-cabundle0\") pod \"keda-operator-ffbb595cb-4cb8t\" (UID: \"d2f8d122-9af9-4aa4-9e4a-087c71cd56fb\") " pod="openshift-keda/keda-operator-ffbb595cb-4cb8t" Apr 22 18:01:06.549059 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.548920 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnnd2\" (UniqueName: \"kubernetes.io/projected/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-kube-api-access-lnnd2\") pod \"keda-operator-ffbb595cb-4cb8t\" (UID: \"d2f8d122-9af9-4aa4-9e4a-087c71cd56fb\") " pod="openshift-keda/keda-operator-ffbb595cb-4cb8t" Apr 22 18:01:06.549059 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.548977 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-certificates\") pod \"keda-operator-ffbb595cb-4cb8t\" (UID: \"d2f8d122-9af9-4aa4-9e4a-087c71cd56fb\") " pod="openshift-keda/keda-operator-ffbb595cb-4cb8t" Apr 
22 18:01:06.649990 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.649908 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-cabundle0\") pod \"keda-operator-ffbb595cb-4cb8t\" (UID: \"d2f8d122-9af9-4aa4-9e4a-087c71cd56fb\") " pod="openshift-keda/keda-operator-ffbb595cb-4cb8t" Apr 22 18:01:06.649990 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.649979 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lnnd2\" (UniqueName: \"kubernetes.io/projected/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-kube-api-access-lnnd2\") pod \"keda-operator-ffbb595cb-4cb8t\" (UID: \"d2f8d122-9af9-4aa4-9e4a-087c71cd56fb\") " pod="openshift-keda/keda-operator-ffbb595cb-4cb8t" Apr 22 18:01:06.650198 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.650034 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-certificates\") pod \"keda-operator-ffbb595cb-4cb8t\" (UID: \"d2f8d122-9af9-4aa4-9e4a-087c71cd56fb\") " pod="openshift-keda/keda-operator-ffbb595cb-4cb8t" Apr 22 18:01:06.650198 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:06.650188 2574 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 22 18:01:06.650288 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:06.650207 2574 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:01:06.650288 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:06.650217 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:01:06.650288 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:06.650234 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-4cb8t: 
[secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 22 18:01:06.650427 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:06.650302 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-certificates podName:d2f8d122-9af9-4aa4-9e4a-087c71cd56fb nodeName:}" failed. No retries permitted until 2026-04-22 18:01:07.150281106 +0000 UTC m=+481.837741491 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-certificates") pod "keda-operator-ffbb595cb-4cb8t" (UID: "d2f8d122-9af9-4aa4-9e4a-087c71cd56fb") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 22 18:01:06.651201 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.651174 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-cabundle0\") pod \"keda-operator-ffbb595cb-4cb8t\" (UID: \"d2f8d122-9af9-4aa4-9e4a-087c71cd56fb\") " pod="openshift-keda/keda-operator-ffbb595cb-4cb8t" Apr 22 18:01:06.661879 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.661858 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnnd2\" (UniqueName: \"kubernetes.io/projected/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-kube-api-access-lnnd2\") pod \"keda-operator-ffbb595cb-4cb8t\" (UID: \"d2f8d122-9af9-4aa4-9e4a-087c71cd56fb\") " pod="openshift-keda/keda-operator-ffbb595cb-4cb8t" Apr 22 18:01:06.795100 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.795069 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2"] Apr 22 18:01:06.819770 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.819746 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2"] Apr 22 18:01:06.819911 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.819866 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2" Apr 22 18:01:06.822435 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.822411 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 22 18:01:06.952181 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.952101 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7497dabc-3320-4ece-8de1-de211e551a51-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5f5w2\" (UID: \"7497dabc-3320-4ece-8de1-de211e551a51\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2" Apr 22 18:01:06.952181 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.952168 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrmrw\" (UniqueName: \"kubernetes.io/projected/7497dabc-3320-4ece-8de1-de211e551a51-kube-api-access-lrmrw\") pod \"keda-metrics-apiserver-7c9f485588-5f5w2\" (UID: \"7497dabc-3320-4ece-8de1-de211e551a51\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2" Apr 22 18:01:06.952382 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.952299 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/7497dabc-3320-4ece-8de1-de211e551a51-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-5f5w2\" (UID: \"7497dabc-3320-4ece-8de1-de211e551a51\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2" Apr 22 18:01:06.994375 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:06.994342 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-keda/keda-admission-cf49989db-m5784"] Apr 22 18:01:07.017637 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:07.017592 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-m5784"] Apr 22 18:01:07.017808 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:07.017735 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-m5784" Apr 22 18:01:07.020378 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:07.020355 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 22 18:01:07.052696 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:07.052667 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrmrw\" (UniqueName: \"kubernetes.io/projected/7497dabc-3320-4ece-8de1-de211e551a51-kube-api-access-lrmrw\") pod \"keda-metrics-apiserver-7c9f485588-5f5w2\" (UID: \"7497dabc-3320-4ece-8de1-de211e551a51\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2" Apr 22 18:01:07.052877 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:07.052748 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/7497dabc-3320-4ece-8de1-de211e551a51-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-5f5w2\" (UID: \"7497dabc-3320-4ece-8de1-de211e551a51\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2" Apr 22 18:01:07.052877 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:07.052776 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7497dabc-3320-4ece-8de1-de211e551a51-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5f5w2\" (UID: \"7497dabc-3320-4ece-8de1-de211e551a51\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2" Apr 22 18:01:07.052998 
ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:07.052901 2574 secret.go:281] references non-existent secret key: tls.crt
Apr 22 18:01:07.052998 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:07.052916 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 18:01:07.052998 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:07.052937 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2: references non-existent secret key: tls.crt
Apr 22 18:01:07.052998 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:07.052994 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7497dabc-3320-4ece-8de1-de211e551a51-certificates podName:7497dabc-3320-4ece-8de1-de211e551a51 nodeName:}" failed. No retries permitted until 2026-04-22 18:01:07.552976665 +0000 UTC m=+482.240437049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7497dabc-3320-4ece-8de1-de211e551a51-certificates") pod "keda-metrics-apiserver-7c9f485588-5f5w2" (UID: "7497dabc-3320-4ece-8de1-de211e551a51") : references non-existent secret key: tls.crt
Apr 22 18:01:07.053202 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:07.053177 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/7497dabc-3320-4ece-8de1-de211e551a51-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-5f5w2\" (UID: \"7497dabc-3320-4ece-8de1-de211e551a51\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2"
Apr 22 18:01:07.063218 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:07.063199 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrmrw\" (UniqueName: \"kubernetes.io/projected/7497dabc-3320-4ece-8de1-de211e551a51-kube-api-access-lrmrw\") pod \"keda-metrics-apiserver-7c9f485588-5f5w2\" (UID: \"7497dabc-3320-4ece-8de1-de211e551a51\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2"
Apr 22 18:01:07.155660 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:07.155624 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-certificates\") pod \"keda-operator-ffbb595cb-4cb8t\" (UID: \"d2f8d122-9af9-4aa4-9e4a-087c71cd56fb\") " pod="openshift-keda/keda-operator-ffbb595cb-4cb8t"
Apr 22 18:01:07.155826 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:07.155718 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjpln\" (UniqueName: \"kubernetes.io/projected/b6ca5125-c09a-479b-8546-46bd9620b540-kube-api-access-bjpln\") pod \"keda-admission-cf49989db-m5784\" (UID: \"b6ca5125-c09a-479b-8546-46bd9620b540\") " pod="openshift-keda/keda-admission-cf49989db-m5784"
Apr 22 18:01:07.155826 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:07.155762 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b6ca5125-c09a-479b-8546-46bd9620b540-certificates\") pod \"keda-admission-cf49989db-m5784\" (UID: \"b6ca5125-c09a-479b-8546-46bd9620b540\") " pod="openshift-keda/keda-admission-cf49989db-m5784"
Apr 22 18:01:07.155826 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:07.155779 2574 secret.go:281] references non-existent secret key: ca.crt
Apr 22 18:01:07.155826 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:07.155799 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 18:01:07.155826 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:07.155808 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-4cb8t: references non-existent secret key: ca.crt
Apr 22 18:01:07.156120 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:07.155875 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-certificates podName:d2f8d122-9af9-4aa4-9e4a-087c71cd56fb nodeName:}" failed. No retries permitted until 2026-04-22 18:01:08.155857253 +0000 UTC m=+482.843317639 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-certificates") pod "keda-operator-ffbb595cb-4cb8t" (UID: "d2f8d122-9af9-4aa4-9e4a-087c71cd56fb") : references non-existent secret key: ca.crt
Apr 22 18:01:07.256689 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:07.256650 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjpln\" (UniqueName: \"kubernetes.io/projected/b6ca5125-c09a-479b-8546-46bd9620b540-kube-api-access-bjpln\") pod \"keda-admission-cf49989db-m5784\" (UID: \"b6ca5125-c09a-479b-8546-46bd9620b540\") " pod="openshift-keda/keda-admission-cf49989db-m5784"
Apr 22 18:01:07.256883 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:07.256717 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b6ca5125-c09a-479b-8546-46bd9620b540-certificates\") pod \"keda-admission-cf49989db-m5784\" (UID: \"b6ca5125-c09a-479b-8546-46bd9620b540\") " pod="openshift-keda/keda-admission-cf49989db-m5784"
Apr 22 18:01:07.259372 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:07.259341 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b6ca5125-c09a-479b-8546-46bd9620b540-certificates\") pod \"keda-admission-cf49989db-m5784\" (UID: \"b6ca5125-c09a-479b-8546-46bd9620b540\") " pod="openshift-keda/keda-admission-cf49989db-m5784"
Apr 22 18:01:07.267062 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:07.267040 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjpln\" (UniqueName: \"kubernetes.io/projected/b6ca5125-c09a-479b-8546-46bd9620b540-kube-api-access-bjpln\") pod \"keda-admission-cf49989db-m5784\" (UID: \"b6ca5125-c09a-479b-8546-46bd9620b540\") " pod="openshift-keda/keda-admission-cf49989db-m5784"
Apr 22 18:01:07.330385 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:07.330355 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-m5784"
Apr 22 18:01:07.469702 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:07.469679 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-m5784"]
Apr 22 18:01:07.472988 ip-10-0-132-24 kubenswrapper[2574]: W0422 18:01:07.472959 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6ca5125_c09a_479b_8546_46bd9620b540.slice/crio-14f8b4e7203a134b090a6666180d44991bd7409aac3524c539aea099e7255443 WatchSource:0}: Error finding container 14f8b4e7203a134b090a6666180d44991bd7409aac3524c539aea099e7255443: Status 404 returned error can't find the container with id 14f8b4e7203a134b090a6666180d44991bd7409aac3524c539aea099e7255443
Apr 22 18:01:07.559782 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:07.559717 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7497dabc-3320-4ece-8de1-de211e551a51-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5f5w2\" (UID: \"7497dabc-3320-4ece-8de1-de211e551a51\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2"
Apr 22 18:01:07.559895 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:07.559845 2574 secret.go:281] references non-existent secret key: tls.crt
Apr 22 18:01:07.559895 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:07.559861 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 18:01:07.559895 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:07.559881 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2: references non-existent secret key: tls.crt
Apr 22 18:01:07.559985 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:07.559929 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7497dabc-3320-4ece-8de1-de211e551a51-certificates podName:7497dabc-3320-4ece-8de1-de211e551a51 nodeName:}" failed. No retries permitted until 2026-04-22 18:01:08.559911526 +0000 UTC m=+483.247371932 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7497dabc-3320-4ece-8de1-de211e551a51-certificates") pod "keda-metrics-apiserver-7c9f485588-5f5w2" (UID: "7497dabc-3320-4ece-8de1-de211e551a51") : references non-existent secret key: tls.crt
Apr 22 18:01:08.163891 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:08.163861 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-certificates\") pod \"keda-operator-ffbb595cb-4cb8t\" (UID: \"d2f8d122-9af9-4aa4-9e4a-087c71cd56fb\") " pod="openshift-keda/keda-operator-ffbb595cb-4cb8t"
Apr 22 18:01:08.164080 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:08.163963 2574 secret.go:281] references non-existent secret key: ca.crt
Apr 22 18:01:08.164080 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:08.163976 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 18:01:08.164080 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:08.163984 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-4cb8t: references non-existent secret key: ca.crt
Apr 22 18:01:08.164080 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:08.164038 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-certificates podName:d2f8d122-9af9-4aa4-9e4a-087c71cd56fb nodeName:}" failed. No retries permitted until 2026-04-22 18:01:10.1640257 +0000 UTC m=+484.851486085 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-certificates") pod "keda-operator-ffbb595cb-4cb8t" (UID: "d2f8d122-9af9-4aa4-9e4a-087c71cd56fb") : references non-existent secret key: ca.crt
Apr 22 18:01:08.322599 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:08.322567 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-m5784" event={"ID":"b6ca5125-c09a-479b-8546-46bd9620b540","Type":"ContainerStarted","Data":"14f8b4e7203a134b090a6666180d44991bd7409aac3524c539aea099e7255443"}
Apr 22 18:01:08.567117 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:08.567079 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7497dabc-3320-4ece-8de1-de211e551a51-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5f5w2\" (UID: \"7497dabc-3320-4ece-8de1-de211e551a51\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2"
Apr 22 18:01:08.567508 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:08.567234 2574 secret.go:281] references non-existent secret key: tls.crt
Apr 22 18:01:08.567508 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:08.567257 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 18:01:08.567508 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:08.567277 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2: references non-existent secret key: tls.crt
Apr 22 18:01:08.567508 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:08.567346 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7497dabc-3320-4ece-8de1-de211e551a51-certificates podName:7497dabc-3320-4ece-8de1-de211e551a51 nodeName:}" failed. No retries permitted until 2026-04-22 18:01:10.56733119 +0000 UTC m=+485.254791591 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7497dabc-3320-4ece-8de1-de211e551a51-certificates") pod "keda-metrics-apiserver-7c9f485588-5f5w2" (UID: "7497dabc-3320-4ece-8de1-de211e551a51") : references non-existent secret key: tls.crt
Apr 22 18:01:10.182204 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:10.182166 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-certificates\") pod \"keda-operator-ffbb595cb-4cb8t\" (UID: \"d2f8d122-9af9-4aa4-9e4a-087c71cd56fb\") " pod="openshift-keda/keda-operator-ffbb595cb-4cb8t"
Apr 22 18:01:10.182604 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:10.182310 2574 secret.go:281] references non-existent secret key: ca.crt
Apr 22 18:01:10.182604 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:10.182329 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 18:01:10.182604 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:10.182338 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-4cb8t: references non-existent secret key: ca.crt
Apr 22 18:01:10.182604 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:01:10.182398 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-certificates podName:d2f8d122-9af9-4aa4-9e4a-087c71cd56fb nodeName:}" failed. No retries permitted until 2026-04-22 18:01:14.182384736 +0000 UTC m=+488.869845121 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-certificates") pod "keda-operator-ffbb595cb-4cb8t" (UID: "d2f8d122-9af9-4aa4-9e4a-087c71cd56fb") : references non-existent secret key: ca.crt
Apr 22 18:01:10.331289 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:10.331249 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-m5784" event={"ID":"b6ca5125-c09a-479b-8546-46bd9620b540","Type":"ContainerStarted","Data":"933d691d1005b9b16c8e260f27cfce7c63572755518847db04c8d0312f1fa411"}
Apr 22 18:01:10.331474 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:10.331392 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-m5784"
Apr 22 18:01:10.349605 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:10.349563 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-m5784" podStartSLOduration=2.15742012 podStartE2EDuration="4.349551768s" podCreationTimestamp="2026-04-22 18:01:06 +0000 UTC" firstStartedPulling="2026-04-22 18:01:07.474361389 +0000 UTC m=+482.161821779" lastFinishedPulling="2026-04-22 18:01:09.666493041 +0000 UTC m=+484.353953427" observedRunningTime="2026-04-22 18:01:10.347981604 +0000 UTC m=+485.035442010" watchObservedRunningTime="2026-04-22 18:01:10.349551768 +0000 UTC m=+485.037012183"
Apr 22 18:01:10.585648 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:10.585611 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7497dabc-3320-4ece-8de1-de211e551a51-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5f5w2\" (UID: \"7497dabc-3320-4ece-8de1-de211e551a51\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2"
Apr 22 18:01:10.588095 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:10.588075 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7497dabc-3320-4ece-8de1-de211e551a51-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5f5w2\" (UID: \"7497dabc-3320-4ece-8de1-de211e551a51\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2"
Apr 22 18:01:10.731659 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:10.731629 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2"
Apr 22 18:01:10.869306 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:10.869282 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2"]
Apr 22 18:01:10.871691 ip-10-0-132-24 kubenswrapper[2574]: W0422 18:01:10.871659 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7497dabc_3320_4ece_8de1_de211e551a51.slice/crio-6f2f47402631f5c4f35cacd09581dd39e648131150f3992c9e78e0e9ccb10549 WatchSource:0}: Error finding container 6f2f47402631f5c4f35cacd09581dd39e648131150f3992c9e78e0e9ccb10549: Status 404 returned error can't find the container with id 6f2f47402631f5c4f35cacd09581dd39e648131150f3992c9e78e0e9ccb10549
Apr 22 18:01:11.335638 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:11.335602 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2" event={"ID":"7497dabc-3320-4ece-8de1-de211e551a51","Type":"ContainerStarted","Data":"6f2f47402631f5c4f35cacd09581dd39e648131150f3992c9e78e0e9ccb10549"}
Apr 22 18:01:14.216208 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:14.216170 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-certificates\") pod \"keda-operator-ffbb595cb-4cb8t\" (UID: \"d2f8d122-9af9-4aa4-9e4a-087c71cd56fb\") " pod="openshift-keda/keda-operator-ffbb595cb-4cb8t"
Apr 22 18:01:14.218537 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:14.218517 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d2f8d122-9af9-4aa4-9e4a-087c71cd56fb-certificates\") pod \"keda-operator-ffbb595cb-4cb8t\" (UID: \"d2f8d122-9af9-4aa4-9e4a-087c71cd56fb\") " pod="openshift-keda/keda-operator-ffbb595cb-4cb8t"
Apr 22 18:01:14.247595 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:14.247561 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-4cb8t"
Apr 22 18:01:14.350513 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:14.350476 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2" event={"ID":"7497dabc-3320-4ece-8de1-de211e551a51","Type":"ContainerStarted","Data":"334b80b18a6adf892401d29c3062b4ebe56bde3048c628e95f35860a7858b13d"}
Apr 22 18:01:14.350693 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:14.350670 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2"
Apr 22 18:01:14.367805 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:14.367776 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-4cb8t"]
Apr 22 18:01:14.370051 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:14.369990 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2" podStartSLOduration=5.663943196 podStartE2EDuration="8.369972262s" podCreationTimestamp="2026-04-22 18:01:06 +0000 UTC" firstStartedPulling="2026-04-22 18:01:10.873055755 +0000 UTC m=+485.560516141" lastFinishedPulling="2026-04-22 18:01:13.579084821 +0000 UTC m=+488.266545207" observedRunningTime="2026-04-22 18:01:14.368538575 +0000 UTC m=+489.055999019" watchObservedRunningTime="2026-04-22 18:01:14.369972262 +0000 UTC m=+489.057432670"
Apr 22 18:01:15.354534 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:15.354500 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-4cb8t" event={"ID":"d2f8d122-9af9-4aa4-9e4a-087c71cd56fb","Type":"ContainerStarted","Data":"44bd4ff78c93b679bb3880bc05517d20042fa6ab6caeb58f435934957ed64f34"}
Apr 22 18:01:18.366938 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:18.366902 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-4cb8t" event={"ID":"d2f8d122-9af9-4aa4-9e4a-087c71cd56fb","Type":"ContainerStarted","Data":"93fea405b2edb72c97f0b11140f11f2b8e257a84af986dd6351718d75e832f5c"}
Apr 22 18:01:18.367337 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:18.367009 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-4cb8t"
Apr 22 18:01:18.387273 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:18.387214 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-4cb8t" podStartSLOduration=9.062925062 podStartE2EDuration="12.387198551s" podCreationTimestamp="2026-04-22 18:01:06 +0000 UTC" firstStartedPulling="2026-04-22 18:01:14.376683737 +0000 UTC m=+489.064144122" lastFinishedPulling="2026-04-22 18:01:17.700957222 +0000 UTC m=+492.388417611" observedRunningTime="2026-04-22 18:01:18.384197515 +0000 UTC m=+493.071657923" watchObservedRunningTime="2026-04-22 18:01:18.387198551 +0000 UTC m=+493.074658960"
Apr 22 18:01:25.358995 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:25.358964 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5f5w2"
Apr 22 18:01:27.320364 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:27.320337 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dh7pb"
Apr 22 18:01:31.338166 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:31.338134 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-m5784"
Apr 22 18:01:39.371858 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.371808 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-4cb8t"
Apr 22 18:01:39.458529 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.458500 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5bcc9444db-nthkt"]
Apr 22 18:01:39.485202 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.485177 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bcc9444db-nthkt"]
Apr 22 18:01:39.485330 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.485277 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.607823 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.607793 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7d2a25-558b-4082-b906-7eab8f6162e9-console-serving-cert\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.607959 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.607827 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a7d2a25-558b-4082-b906-7eab8f6162e9-trusted-ca-bundle\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.607959 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.607900 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a7d2a25-558b-4082-b906-7eab8f6162e9-console-oauth-config\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.607959 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.607921 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a7d2a25-558b-4082-b906-7eab8f6162e9-console-config\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.608058 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.607961 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a7d2a25-558b-4082-b906-7eab8f6162e9-service-ca\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.608058 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.608027 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57nd7\" (UniqueName: \"kubernetes.io/projected/5a7d2a25-558b-4082-b906-7eab8f6162e9-kube-api-access-57nd7\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.608058 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.608048 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a7d2a25-558b-4082-b906-7eab8f6162e9-oauth-serving-cert\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.709238 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.709178 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7d2a25-558b-4082-b906-7eab8f6162e9-console-serving-cert\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.709238 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.709210 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a7d2a25-558b-4082-b906-7eab8f6162e9-trusted-ca-bundle\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.709433 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.709249 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a7d2a25-558b-4082-b906-7eab8f6162e9-console-oauth-config\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.709433 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.709297 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a7d2a25-558b-4082-b906-7eab8f6162e9-console-config\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.709433 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.709319 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a7d2a25-558b-4082-b906-7eab8f6162e9-service-ca\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.709433 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.709348 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57nd7\" (UniqueName: \"kubernetes.io/projected/5a7d2a25-558b-4082-b906-7eab8f6162e9-kube-api-access-57nd7\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.709433 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.709376 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a7d2a25-558b-4082-b906-7eab8f6162e9-oauth-serving-cert\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.710199 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.710172 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a7d2a25-558b-4082-b906-7eab8f6162e9-console-config\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.710307 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.710200 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a7d2a25-558b-4082-b906-7eab8f6162e9-trusted-ca-bundle\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.710307 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.710225 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a7d2a25-558b-4082-b906-7eab8f6162e9-service-ca\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.710307 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.710282 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a7d2a25-558b-4082-b906-7eab8f6162e9-oauth-serving-cert\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.711889 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.711866 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7d2a25-558b-4082-b906-7eab8f6162e9-console-serving-cert\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.712047 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.712028 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a7d2a25-558b-4082-b906-7eab8f6162e9-console-oauth-config\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.717923 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.717902 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57nd7\" (UniqueName: \"kubernetes.io/projected/5a7d2a25-558b-4082-b906-7eab8f6162e9-kube-api-access-57nd7\") pod \"console-5bcc9444db-nthkt\" (UID: \"5a7d2a25-558b-4082-b906-7eab8f6162e9\") " pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.794143 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.794119 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:39.917961 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:39.917900 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bcc9444db-nthkt"]
Apr 22 18:01:39.919978 ip-10-0-132-24 kubenswrapper[2574]: W0422 18:01:39.919950 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a7d2a25_558b_4082_b906_7eab8f6162e9.slice/crio-3c7c05dbdc0466c3fc1b1c73ab9f53b617d3765d1a67158f2e3ca015f455ad3f WatchSource:0}: Error finding container 3c7c05dbdc0466c3fc1b1c73ab9f53b617d3765d1a67158f2e3ca015f455ad3f: Status 404 returned error can't find the container with id 3c7c05dbdc0466c3fc1b1c73ab9f53b617d3765d1a67158f2e3ca015f455ad3f
Apr 22 18:01:40.437561 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:40.437480 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bcc9444db-nthkt" event={"ID":"5a7d2a25-558b-4082-b906-7eab8f6162e9","Type":"ContainerStarted","Data":"b37438dad9f96814ebe16e30599c88fe3a176c8288be951815008bc199e3c125"}
Apr 22 18:01:40.437561 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:40.437518 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bcc9444db-nthkt" event={"ID":"5a7d2a25-558b-4082-b906-7eab8f6162e9","Type":"ContainerStarted","Data":"3c7c05dbdc0466c3fc1b1c73ab9f53b617d3765d1a67158f2e3ca015f455ad3f"}
Apr 22 18:01:40.455804 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:40.455758 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5bcc9444db-nthkt" podStartSLOduration=1.455744935 podStartE2EDuration="1.455744935s" podCreationTimestamp="2026-04-22 18:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:01:40.453389229 +0000 UTC m=+515.140849635" watchObservedRunningTime="2026-04-22 18:01:40.455744935 +0000 UTC m=+515.143205342"
Apr 22 18:01:49.795148 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:49.795115 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:49.795516 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:49.795211 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:49.800602 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:49.800582 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:50.478492 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:50.478463 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5bcc9444db-nthkt"
Apr 22 18:01:50.567787 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:01:50.567757 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6ccd7d4dc4-5qt5l"]
Apr 22 18:02:14.474245 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.474213 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-kjbjt"]
Apr 22 18:02:14.479286 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.479267 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-644fd69db4-kjbjt"
Apr 22 18:02:14.482388 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.482367 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 22 18:02:14.482503 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.482395 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 22 18:02:14.482747 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.482729 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 22 18:02:14.485561 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.485543 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-c6lxb\""
Apr 22 18:02:14.504006 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.503967 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-kjbjt"]
Apr 22 18:02:14.549993 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.549967 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-2pw8c"]
Apr 22 18:02:14.552273 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.552251 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-2pw8c"
Apr 22 18:02:14.555567 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.555529 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-ddd5r\""
Apr 22 18:02:14.555567 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.555553 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 22 18:02:14.558172 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.558149 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01646ef7-6e90-483d-90a2-867852bde63e-cert\") pod \"kserve-controller-manager-644fd69db4-kjbjt\" (UID: \"01646ef7-6e90-483d-90a2-867852bde63e\") " pod="kserve/kserve-controller-manager-644fd69db4-kjbjt"
Apr 22 18:02:14.558285 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.558205 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cp25\" (UniqueName: \"kubernetes.io/projected/01646ef7-6e90-483d-90a2-867852bde63e-kube-api-access-2cp25\") pod \"kserve-controller-manager-644fd69db4-kjbjt\" (UID: \"01646ef7-6e90-483d-90a2-867852bde63e\") " pod="kserve/kserve-controller-manager-644fd69db4-kjbjt"
Apr 22 18:02:14.573136 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.573112 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-2pw8c"]
Apr 22 18:02:14.658952 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.658920 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9dzn\" (UniqueName: \"kubernetes.io/projected/519d0279-49b6-4228-ba0b-94e6b4bc62bb-kube-api-access-f9dzn\") pod \"seaweedfs-86cc847c5c-2pw8c\" (UID: \"519d0279-49b6-4228-ba0b-94e6b4bc62bb\") " pod="kserve/seaweedfs-86cc847c5c-2pw8c"
Apr 22 18:02:14.659065 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.658960 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/519d0279-49b6-4228-ba0b-94e6b4bc62bb-data\") pod \"seaweedfs-86cc847c5c-2pw8c\" (UID: \"519d0279-49b6-4228-ba0b-94e6b4bc62bb\") " pod="kserve/seaweedfs-86cc847c5c-2pw8c"
Apr 22 18:02:14.659124 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.659061 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01646ef7-6e90-483d-90a2-867852bde63e-cert\") pod \"kserve-controller-manager-644fd69db4-kjbjt\" (UID: \"01646ef7-6e90-483d-90a2-867852bde63e\") " pod="kserve/kserve-controller-manager-644fd69db4-kjbjt"
Apr 22 18:02:14.659124 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.659100 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cp25\" (UniqueName: \"kubernetes.io/projected/01646ef7-6e90-483d-90a2-867852bde63e-kube-api-access-2cp25\") pod \"kserve-controller-manager-644fd69db4-kjbjt\" (UID: \"01646ef7-6e90-483d-90a2-867852bde63e\") " pod="kserve/kserve-controller-manager-644fd69db4-kjbjt"
Apr 22 18:02:14.661390 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.661373 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01646ef7-6e90-483d-90a2-867852bde63e-cert\") pod \"kserve-controller-manager-644fd69db4-kjbjt\" (UID: \"01646ef7-6e90-483d-90a2-867852bde63e\") " pod="kserve/kserve-controller-manager-644fd69db4-kjbjt"
Apr 22 18:02:14.674048 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.674026 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cp25\" (UniqueName: \"kubernetes.io/projected/01646ef7-6e90-483d-90a2-867852bde63e-kube-api-access-2cp25\") pod \"kserve-controller-manager-644fd69db4-kjbjt\" (UID: 
\"01646ef7-6e90-483d-90a2-867852bde63e\") " pod="kserve/kserve-controller-manager-644fd69db4-kjbjt" Apr 22 18:02:14.760393 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.760323 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/519d0279-49b6-4228-ba0b-94e6b4bc62bb-data\") pod \"seaweedfs-86cc847c5c-2pw8c\" (UID: \"519d0279-49b6-4228-ba0b-94e6b4bc62bb\") " pod="kserve/seaweedfs-86cc847c5c-2pw8c" Apr 22 18:02:14.760503 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.760436 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9dzn\" (UniqueName: \"kubernetes.io/projected/519d0279-49b6-4228-ba0b-94e6b4bc62bb-kube-api-access-f9dzn\") pod \"seaweedfs-86cc847c5c-2pw8c\" (UID: \"519d0279-49b6-4228-ba0b-94e6b4bc62bb\") " pod="kserve/seaweedfs-86cc847c5c-2pw8c" Apr 22 18:02:14.760668 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.760650 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/519d0279-49b6-4228-ba0b-94e6b4bc62bb-data\") pod \"seaweedfs-86cc847c5c-2pw8c\" (UID: \"519d0279-49b6-4228-ba0b-94e6b4bc62bb\") " pod="kserve/seaweedfs-86cc847c5c-2pw8c" Apr 22 18:02:14.769995 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.769970 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9dzn\" (UniqueName: \"kubernetes.io/projected/519d0279-49b6-4228-ba0b-94e6b4bc62bb-kube-api-access-f9dzn\") pod \"seaweedfs-86cc847c5c-2pw8c\" (UID: \"519d0279-49b6-4228-ba0b-94e6b4bc62bb\") " pod="kserve/seaweedfs-86cc847c5c-2pw8c" Apr 22 18:02:14.789792 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.789771 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-644fd69db4-kjbjt" Apr 22 18:02:14.861468 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.861441 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-2pw8c" Apr 22 18:02:14.916640 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.916553 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-kjbjt"] Apr 22 18:02:14.920243 ip-10-0-132-24 kubenswrapper[2574]: W0422 18:02:14.920216 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01646ef7_6e90_483d_90a2_867852bde63e.slice/crio-a326a420659372f61aca26a81b3c8e4515277bb41d1858ced0f4a7e90e8caaf2 WatchSource:0}: Error finding container a326a420659372f61aca26a81b3c8e4515277bb41d1858ced0f4a7e90e8caaf2: Status 404 returned error can't find the container with id a326a420659372f61aca26a81b3c8e4515277bb41d1858ced0f4a7e90e8caaf2 Apr 22 18:02:14.989670 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:14.989646 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-2pw8c"] Apr 22 18:02:14.991406 ip-10-0-132-24 kubenswrapper[2574]: W0422 18:02:14.991373 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod519d0279_49b6_4228_ba0b_94e6b4bc62bb.slice/crio-1d28967c1ebd3f70fe2a8837b63f5ccd6b71971a47f0b2a657bd538ac294d869 WatchSource:0}: Error finding container 1d28967c1ebd3f70fe2a8837b63f5ccd6b71971a47f0b2a657bd538ac294d869: Status 404 returned error can't find the container with id 1d28967c1ebd3f70fe2a8837b63f5ccd6b71971a47f0b2a657bd538ac294d869 Apr 22 18:02:15.560782 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:15.560737 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-2pw8c" 
event={"ID":"519d0279-49b6-4228-ba0b-94e6b4bc62bb","Type":"ContainerStarted","Data":"1d28967c1ebd3f70fe2a8837b63f5ccd6b71971a47f0b2a657bd538ac294d869"} Apr 22 18:02:15.561889 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:15.561855 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-644fd69db4-kjbjt" event={"ID":"01646ef7-6e90-483d-90a2-867852bde63e","Type":"ContainerStarted","Data":"a326a420659372f61aca26a81b3c8e4515277bb41d1858ced0f4a7e90e8caaf2"} Apr 22 18:02:15.586162 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:15.586129 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6ccd7d4dc4-5qt5l" podUID="0fd84a93-9b95-48c2-86c0-720486cf66cd" containerName="console" containerID="cri-o://72b07df33574d2959329a3504620a273b1ee850055d545aeeafe9f0180a2f12a" gracePeriod=15 Apr 22 18:02:15.843717 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:15.843687 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6ccd7d4dc4-5qt5l_0fd84a93-9b95-48c2-86c0-720486cf66cd/console/0.log" Apr 22 18:02:15.843847 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:15.843763 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6ccd7d4dc4-5qt5l" Apr 22 18:02:15.972890 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:15.972861 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9czng\" (UniqueName: \"kubernetes.io/projected/0fd84a93-9b95-48c2-86c0-720486cf66cd-kube-api-access-9czng\") pod \"0fd84a93-9b95-48c2-86c0-720486cf66cd\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " Apr 22 18:02:15.973055 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:15.972904 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0fd84a93-9b95-48c2-86c0-720486cf66cd-console-oauth-config\") pod \"0fd84a93-9b95-48c2-86c0-720486cf66cd\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " Apr 22 18:02:15.973055 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:15.972971 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-service-ca\") pod \"0fd84a93-9b95-48c2-86c0-720486cf66cd\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " Apr 22 18:02:15.973055 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:15.972996 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-trusted-ca-bundle\") pod \"0fd84a93-9b95-48c2-86c0-720486cf66cd\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " Apr 22 18:02:15.973055 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:15.973028 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-oauth-serving-cert\") pod \"0fd84a93-9b95-48c2-86c0-720486cf66cd\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " Apr 22 18:02:15.973280 
ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:15.973058 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-console-config\") pod \"0fd84a93-9b95-48c2-86c0-720486cf66cd\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " Apr 22 18:02:15.973280 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:15.973087 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd84a93-9b95-48c2-86c0-720486cf66cd-console-serving-cert\") pod \"0fd84a93-9b95-48c2-86c0-720486cf66cd\" (UID: \"0fd84a93-9b95-48c2-86c0-720486cf66cd\") " Apr 22 18:02:15.973659 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:15.973601 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-service-ca" (OuterVolumeSpecName: "service-ca") pod "0fd84a93-9b95-48c2-86c0-720486cf66cd" (UID: "0fd84a93-9b95-48c2-86c0-720486cf66cd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:02:15.973766 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:15.973723 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0fd84a93-9b95-48c2-86c0-720486cf66cd" (UID: "0fd84a93-9b95-48c2-86c0-720486cf66cd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:02:15.973861 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:15.973816 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0fd84a93-9b95-48c2-86c0-720486cf66cd" (UID: "0fd84a93-9b95-48c2-86c0-720486cf66cd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:02:15.973941 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:15.973923 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-console-config" (OuterVolumeSpecName: "console-config") pod "0fd84a93-9b95-48c2-86c0-720486cf66cd" (UID: "0fd84a93-9b95-48c2-86c0-720486cf66cd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:02:15.975506 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:15.975472 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fd84a93-9b95-48c2-86c0-720486cf66cd-kube-api-access-9czng" (OuterVolumeSpecName: "kube-api-access-9czng") pod "0fd84a93-9b95-48c2-86c0-720486cf66cd" (UID: "0fd84a93-9b95-48c2-86c0-720486cf66cd"). InnerVolumeSpecName "kube-api-access-9czng". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:02:15.975638 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:15.975544 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd84a93-9b95-48c2-86c0-720486cf66cd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0fd84a93-9b95-48c2-86c0-720486cf66cd" (UID: "0fd84a93-9b95-48c2-86c0-720486cf66cd"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:02:15.975638 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:15.975593 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd84a93-9b95-48c2-86c0-720486cf66cd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0fd84a93-9b95-48c2-86c0-720486cf66cd" (UID: "0fd84a93-9b95-48c2-86c0-720486cf66cd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:02:16.074032 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:16.074004 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9czng\" (UniqueName: \"kubernetes.io/projected/0fd84a93-9b95-48c2-86c0-720486cf66cd-kube-api-access-9czng\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\"" Apr 22 18:02:16.074032 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:16.074031 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0fd84a93-9b95-48c2-86c0-720486cf66cd-console-oauth-config\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\"" Apr 22 18:02:16.074190 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:16.074043 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-service-ca\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\"" Apr 22 18:02:16.074190 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:16.074056 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-trusted-ca-bundle\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\"" Apr 22 18:02:16.074190 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:16.074069 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-oauth-serving-cert\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\"" Apr 22 18:02:16.074190 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:16.074085 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0fd84a93-9b95-48c2-86c0-720486cf66cd-console-config\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\"" Apr 22 18:02:16.074190 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:16.074098 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd84a93-9b95-48c2-86c0-720486cf66cd-console-serving-cert\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\"" Apr 22 18:02:16.566116 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:16.566083 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6ccd7d4dc4-5qt5l_0fd84a93-9b95-48c2-86c0-720486cf66cd/console/0.log" Apr 22 18:02:16.566484 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:16.566128 2574 generic.go:358] "Generic (PLEG): container finished" podID="0fd84a93-9b95-48c2-86c0-720486cf66cd" containerID="72b07df33574d2959329a3504620a273b1ee850055d545aeeafe9f0180a2f12a" exitCode=2 Apr 22 18:02:16.566484 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:16.566199 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6ccd7d4dc4-5qt5l" Apr 22 18:02:16.566484 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:16.566197 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ccd7d4dc4-5qt5l" event={"ID":"0fd84a93-9b95-48c2-86c0-720486cf66cd","Type":"ContainerDied","Data":"72b07df33574d2959329a3504620a273b1ee850055d545aeeafe9f0180a2f12a"} Apr 22 18:02:16.566484 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:16.566301 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ccd7d4dc4-5qt5l" event={"ID":"0fd84a93-9b95-48c2-86c0-720486cf66cd","Type":"ContainerDied","Data":"d15fee5e0015b0988fc7e63e87e19975da90ec5cd799b1b7c60172906952b8a9"} Apr 22 18:02:16.566484 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:16.566321 2574 scope.go:117] "RemoveContainer" containerID="72b07df33574d2959329a3504620a273b1ee850055d545aeeafe9f0180a2f12a" Apr 22 18:02:16.574913 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:16.574892 2574 scope.go:117] "RemoveContainer" containerID="72b07df33574d2959329a3504620a273b1ee850055d545aeeafe9f0180a2f12a" Apr 22 18:02:16.575182 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:02:16.575164 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72b07df33574d2959329a3504620a273b1ee850055d545aeeafe9f0180a2f12a\": container with ID starting with 72b07df33574d2959329a3504620a273b1ee850055d545aeeafe9f0180a2f12a not found: ID does not exist" containerID="72b07df33574d2959329a3504620a273b1ee850055d545aeeafe9f0180a2f12a" Apr 22 18:02:16.575251 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:16.575190 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72b07df33574d2959329a3504620a273b1ee850055d545aeeafe9f0180a2f12a"} err="failed to get container status \"72b07df33574d2959329a3504620a273b1ee850055d545aeeafe9f0180a2f12a\": rpc error: code = 
NotFound desc = could not find container \"72b07df33574d2959329a3504620a273b1ee850055d545aeeafe9f0180a2f12a\": container with ID starting with 72b07df33574d2959329a3504620a273b1ee850055d545aeeafe9f0180a2f12a not found: ID does not exist" Apr 22 18:02:16.590170 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:16.590144 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6ccd7d4dc4-5qt5l"] Apr 22 18:02:16.593370 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:16.593349 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6ccd7d4dc4-5qt5l"] Apr 22 18:02:17.856012 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:17.855974 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fd84a93-9b95-48c2-86c0-720486cf66cd" path="/var/lib/kubelet/pods/0fd84a93-9b95-48c2-86c0-720486cf66cd/volumes" Apr 22 18:02:18.574878 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:18.574847 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-2pw8c" event={"ID":"519d0279-49b6-4228-ba0b-94e6b4bc62bb","Type":"ContainerStarted","Data":"9ac5d35a8de8f5a5a8c30221e6808ed235268cfe554cabe203985b957f5258bc"} Apr 22 18:02:18.575028 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:18.574960 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-2pw8c" Apr 22 18:02:18.593943 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:18.593903 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-2pw8c" podStartSLOduration=1.974514584 podStartE2EDuration="4.593891693s" podCreationTimestamp="2026-04-22 18:02:14 +0000 UTC" firstStartedPulling="2026-04-22 18:02:14.992941252 +0000 UTC m=+549.680401638" lastFinishedPulling="2026-04-22 18:02:17.612318357 +0000 UTC m=+552.299778747" observedRunningTime="2026-04-22 18:02:18.592515283 +0000 UTC m=+553.279975689" watchObservedRunningTime="2026-04-22 
18:02:18.593891693 +0000 UTC m=+553.281352098" Apr 22 18:02:23.593506 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:23.593470 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-644fd69db4-kjbjt" event={"ID":"01646ef7-6e90-483d-90a2-867852bde63e","Type":"ContainerStarted","Data":"61be3ca97cfa7df563b76c9c5cb18e37a97c3b238bbcd96e92bd9d8a318dc57f"} Apr 22 18:02:23.593938 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:23.593580 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-644fd69db4-kjbjt" Apr 22 18:02:23.614456 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:23.614411 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-644fd69db4-kjbjt" podStartSLOduration=1.071810726 podStartE2EDuration="9.614399377s" podCreationTimestamp="2026-04-22 18:02:14 +0000 UTC" firstStartedPulling="2026-04-22 18:02:14.921814686 +0000 UTC m=+549.609275088" lastFinishedPulling="2026-04-22 18:02:23.464403355 +0000 UTC m=+558.151863739" observedRunningTime="2026-04-22 18:02:23.612308917 +0000 UTC m=+558.299769333" watchObservedRunningTime="2026-04-22 18:02:23.614399377 +0000 UTC m=+558.301859809" Apr 22 18:02:24.580982 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:24.580947 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-2pw8c" Apr 22 18:02:50.095410 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.095368 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-kjbjt"] Apr 22 18:02:50.095809 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.095626 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-644fd69db4-kjbjt" podUID="01646ef7-6e90-483d-90a2-867852bde63e" containerName="manager" 
containerID="cri-o://61be3ca97cfa7df563b76c9c5cb18e37a97c3b238bbcd96e92bd9d8a318dc57f" gracePeriod=10 Apr 22 18:02:50.118160 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.118135 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-fmdbg"] Apr 22 18:02:50.118445 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.118433 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fd84a93-9b95-48c2-86c0-720486cf66cd" containerName="console" Apr 22 18:02:50.118491 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.118446 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd84a93-9b95-48c2-86c0-720486cf66cd" containerName="console" Apr 22 18:02:50.118525 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.118520 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fd84a93-9b95-48c2-86c0-720486cf66cd" containerName="console" Apr 22 18:02:50.121591 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.121574 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-644fd69db4-fmdbg" Apr 22 18:02:50.129982 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.129961 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-fmdbg"] Apr 22 18:02:50.228074 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.228041 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx984\" (UniqueName: \"kubernetes.io/projected/55a4b859-0707-4990-9340-093b91f0a22e-kube-api-access-qx984\") pod \"kserve-controller-manager-644fd69db4-fmdbg\" (UID: \"55a4b859-0707-4990-9340-093b91f0a22e\") " pod="kserve/kserve-controller-manager-644fd69db4-fmdbg" Apr 22 18:02:50.228180 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.228157 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55a4b859-0707-4990-9340-093b91f0a22e-cert\") pod \"kserve-controller-manager-644fd69db4-fmdbg\" (UID: \"55a4b859-0707-4990-9340-093b91f0a22e\") " pod="kserve/kserve-controller-manager-644fd69db4-fmdbg" Apr 22 18:02:50.328573 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.328538 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55a4b859-0707-4990-9340-093b91f0a22e-cert\") pod \"kserve-controller-manager-644fd69db4-fmdbg\" (UID: \"55a4b859-0707-4990-9340-093b91f0a22e\") " pod="kserve/kserve-controller-manager-644fd69db4-fmdbg" Apr 22 18:02:50.328681 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.328616 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qx984\" (UniqueName: \"kubernetes.io/projected/55a4b859-0707-4990-9340-093b91f0a22e-kube-api-access-qx984\") pod \"kserve-controller-manager-644fd69db4-fmdbg\" (UID: \"55a4b859-0707-4990-9340-093b91f0a22e\") " 
pod="kserve/kserve-controller-manager-644fd69db4-fmdbg" Apr 22 18:02:50.331129 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.331106 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55a4b859-0707-4990-9340-093b91f0a22e-cert\") pod \"kserve-controller-manager-644fd69db4-fmdbg\" (UID: \"55a4b859-0707-4990-9340-093b91f0a22e\") " pod="kserve/kserve-controller-manager-644fd69db4-fmdbg" Apr 22 18:02:50.331373 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.331359 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-644fd69db4-kjbjt" Apr 22 18:02:50.338039 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.338017 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx984\" (UniqueName: \"kubernetes.io/projected/55a4b859-0707-4990-9340-093b91f0a22e-kube-api-access-qx984\") pod \"kserve-controller-manager-644fd69db4-fmdbg\" (UID: \"55a4b859-0707-4990-9340-093b91f0a22e\") " pod="kserve/kserve-controller-manager-644fd69db4-fmdbg" Apr 22 18:02:50.429862 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.429777 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cp25\" (UniqueName: \"kubernetes.io/projected/01646ef7-6e90-483d-90a2-867852bde63e-kube-api-access-2cp25\") pod \"01646ef7-6e90-483d-90a2-867852bde63e\" (UID: \"01646ef7-6e90-483d-90a2-867852bde63e\") " Apr 22 18:02:50.429986 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.429861 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01646ef7-6e90-483d-90a2-867852bde63e-cert\") pod \"01646ef7-6e90-483d-90a2-867852bde63e\" (UID: \"01646ef7-6e90-483d-90a2-867852bde63e\") " Apr 22 18:02:50.431978 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.431946 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/01646ef7-6e90-483d-90a2-867852bde63e-kube-api-access-2cp25" (OuterVolumeSpecName: "kube-api-access-2cp25") pod "01646ef7-6e90-483d-90a2-867852bde63e" (UID: "01646ef7-6e90-483d-90a2-867852bde63e"). InnerVolumeSpecName "kube-api-access-2cp25". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:02:50.431978 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.431963 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01646ef7-6e90-483d-90a2-867852bde63e-cert" (OuterVolumeSpecName: "cert") pod "01646ef7-6e90-483d-90a2-867852bde63e" (UID: "01646ef7-6e90-483d-90a2-867852bde63e"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:02:50.484910 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.484887 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-644fd69db4-fmdbg" Apr 22 18:02:50.531953 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.531777 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2cp25\" (UniqueName: \"kubernetes.io/projected/01646ef7-6e90-483d-90a2-867852bde63e-kube-api-access-2cp25\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\"" Apr 22 18:02:50.531953 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.531805 2574 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01646ef7-6e90-483d-90a2-867852bde63e-cert\") on node \"ip-10-0-132-24.ec2.internal\" DevicePath \"\"" Apr 22 18:02:50.617135 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.617105 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-fmdbg"] Apr 22 18:02:50.620502 ip-10-0-132-24 kubenswrapper[2574]: W0422 18:02:50.620473 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55a4b859_0707_4990_9340_093b91f0a22e.slice/crio-99980ac04d5478c6f3a143c68d222ac63c732c72cf9ac546f691be50f32ed373 WatchSource:0}: Error finding container 99980ac04d5478c6f3a143c68d222ac63c732c72cf9ac546f691be50f32ed373: Status 404 returned error can't find the container with id 99980ac04d5478c6f3a143c68d222ac63c732c72cf9ac546f691be50f32ed373 Apr 22 18:02:50.684063 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.683999 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-644fd69db4-fmdbg" event={"ID":"55a4b859-0707-4990-9340-093b91f0a22e","Type":"ContainerStarted","Data":"99980ac04d5478c6f3a143c68d222ac63c732c72cf9ac546f691be50f32ed373"} Apr 22 18:02:50.685042 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.685017 2574 generic.go:358] "Generic (PLEG): container finished" podID="01646ef7-6e90-483d-90a2-867852bde63e" containerID="61be3ca97cfa7df563b76c9c5cb18e37a97c3b238bbcd96e92bd9d8a318dc57f" exitCode=0 Apr 22 18:02:50.685124 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.685087 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-644fd69db4-kjbjt" Apr 22 18:02:50.685174 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.685087 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-644fd69db4-kjbjt" event={"ID":"01646ef7-6e90-483d-90a2-867852bde63e","Type":"ContainerDied","Data":"61be3ca97cfa7df563b76c9c5cb18e37a97c3b238bbcd96e92bd9d8a318dc57f"} Apr 22 18:02:50.685214 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.685185 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-644fd69db4-kjbjt" event={"ID":"01646ef7-6e90-483d-90a2-867852bde63e","Type":"ContainerDied","Data":"a326a420659372f61aca26a81b3c8e4515277bb41d1858ced0f4a7e90e8caaf2"} Apr 22 18:02:50.685214 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.685205 2574 scope.go:117] "RemoveContainer" containerID="61be3ca97cfa7df563b76c9c5cb18e37a97c3b238bbcd96e92bd9d8a318dc57f" Apr 22 18:02:50.693016 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.692999 2574 scope.go:117] "RemoveContainer" containerID="61be3ca97cfa7df563b76c9c5cb18e37a97c3b238bbcd96e92bd9d8a318dc57f" Apr 22 18:02:50.693273 ip-10-0-132-24 kubenswrapper[2574]: E0422 18:02:50.693255 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61be3ca97cfa7df563b76c9c5cb18e37a97c3b238bbcd96e92bd9d8a318dc57f\": container with ID starting with 61be3ca97cfa7df563b76c9c5cb18e37a97c3b238bbcd96e92bd9d8a318dc57f not found: ID does not exist" containerID="61be3ca97cfa7df563b76c9c5cb18e37a97c3b238bbcd96e92bd9d8a318dc57f" Apr 22 18:02:50.693335 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.693280 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61be3ca97cfa7df563b76c9c5cb18e37a97c3b238bbcd96e92bd9d8a318dc57f"} err="failed to get container status \"61be3ca97cfa7df563b76c9c5cb18e37a97c3b238bbcd96e92bd9d8a318dc57f\": rpc 
error: code = NotFound desc = could not find container \"61be3ca97cfa7df563b76c9c5cb18e37a97c3b238bbcd96e92bd9d8a318dc57f\": container with ID starting with 61be3ca97cfa7df563b76c9c5cb18e37a97c3b238bbcd96e92bd9d8a318dc57f not found: ID does not exist" Apr 22 18:02:50.706811 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.706789 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-kjbjt"] Apr 22 18:02:50.708294 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:50.708276 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-kjbjt"] Apr 22 18:02:51.690059 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:51.690024 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-644fd69db4-fmdbg" event={"ID":"55a4b859-0707-4990-9340-093b91f0a22e","Type":"ContainerStarted","Data":"a2fb4ebc8f9b828687ebbbc68f15c7cec668a1115c7beadd56fe4cc8acb8296e"} Apr 22 18:02:51.690533 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:51.690142 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-644fd69db4-fmdbg" Apr 22 18:02:51.706773 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:51.706732 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-644fd69db4-fmdbg" podStartSLOduration=1.392601971 podStartE2EDuration="1.706720255s" podCreationTimestamp="2026-04-22 18:02:50 +0000 UTC" firstStartedPulling="2026-04-22 18:02:50.621656426 +0000 UTC m=+585.309116811" lastFinishedPulling="2026-04-22 18:02:50.935774705 +0000 UTC m=+585.623235095" observedRunningTime="2026-04-22 18:02:51.705804308 +0000 UTC m=+586.393264716" watchObservedRunningTime="2026-04-22 18:02:51.706720255 +0000 UTC m=+586.394180690" Apr 22 18:02:51.856034 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:02:51.856004 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="01646ef7-6e90-483d-90a2-867852bde63e" path="/var/lib/kubelet/pods/01646ef7-6e90-483d-90a2-867852bde63e/volumes" Apr 22 18:03:05.786591 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:05.786565 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:03:05.786981 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:05.786569 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:03:22.698624 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:22.698545 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-644fd69db4-fmdbg" Apr 22 18:03:23.609886 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:23.609854 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-td6db"] Apr 22 18:03:23.610189 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:23.610178 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01646ef7-6e90-483d-90a2-867852bde63e" containerName="manager" Apr 22 18:03:23.610232 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:23.610191 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="01646ef7-6e90-483d-90a2-867852bde63e" containerName="manager" Apr 22 18:03:23.610266 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:23.610242 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="01646ef7-6e90-483d-90a2-867852bde63e" containerName="manager" Apr 22 18:03:23.613215 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:23.613191 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-td6db" Apr 22 18:03:23.617066 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:23.617043 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 22 18:03:23.617173 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:23.617044 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-n5xww\"" Apr 22 18:03:23.626145 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:23.626122 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-td6db"] Apr 22 18:03:23.679021 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:23.679000 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3af531fa-df90-4bdf-8278-1b9bca7a4846-cert\") pod \"odh-model-controller-696fc77849-td6db\" (UID: \"3af531fa-df90-4bdf-8278-1b9bca7a4846\") " pod="kserve/odh-model-controller-696fc77849-td6db" Apr 22 18:03:23.679128 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:23.679037 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjdd6\" (UniqueName: \"kubernetes.io/projected/3af531fa-df90-4bdf-8278-1b9bca7a4846-kube-api-access-qjdd6\") pod \"odh-model-controller-696fc77849-td6db\" (UID: \"3af531fa-df90-4bdf-8278-1b9bca7a4846\") " pod="kserve/odh-model-controller-696fc77849-td6db" Apr 22 18:03:23.780414 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:23.780391 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3af531fa-df90-4bdf-8278-1b9bca7a4846-cert\") pod \"odh-model-controller-696fc77849-td6db\" (UID: \"3af531fa-df90-4bdf-8278-1b9bca7a4846\") " pod="kserve/odh-model-controller-696fc77849-td6db" Apr 22 18:03:23.780692 ip-10-0-132-24 
kubenswrapper[2574]: I0422 18:03:23.780430 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjdd6\" (UniqueName: \"kubernetes.io/projected/3af531fa-df90-4bdf-8278-1b9bca7a4846-kube-api-access-qjdd6\") pod \"odh-model-controller-696fc77849-td6db\" (UID: \"3af531fa-df90-4bdf-8278-1b9bca7a4846\") " pod="kserve/odh-model-controller-696fc77849-td6db" Apr 22 18:03:23.782695 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:23.782670 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3af531fa-df90-4bdf-8278-1b9bca7a4846-cert\") pod \"odh-model-controller-696fc77849-td6db\" (UID: \"3af531fa-df90-4bdf-8278-1b9bca7a4846\") " pod="kserve/odh-model-controller-696fc77849-td6db" Apr 22 18:03:23.790151 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:23.790124 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjdd6\" (UniqueName: \"kubernetes.io/projected/3af531fa-df90-4bdf-8278-1b9bca7a4846-kube-api-access-qjdd6\") pod \"odh-model-controller-696fc77849-td6db\" (UID: \"3af531fa-df90-4bdf-8278-1b9bca7a4846\") " pod="kserve/odh-model-controller-696fc77849-td6db" Apr 22 18:03:23.925418 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:23.925328 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-td6db" Apr 22 18:03:24.051535 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:24.051508 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-td6db"] Apr 22 18:03:24.053477 ip-10-0-132-24 kubenswrapper[2574]: W0422 18:03:24.053444 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3af531fa_df90_4bdf_8278_1b9bca7a4846.slice/crio-4ff6a26fe357a7d1c025aee67ef32c91dbbeb7c0184a675944647e92a11886da WatchSource:0}: Error finding container 4ff6a26fe357a7d1c025aee67ef32c91dbbeb7c0184a675944647e92a11886da: Status 404 returned error can't find the container with id 4ff6a26fe357a7d1c025aee67ef32c91dbbeb7c0184a675944647e92a11886da Apr 22 18:03:24.811130 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:24.811090 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-td6db" event={"ID":"3af531fa-df90-4bdf-8278-1b9bca7a4846","Type":"ContainerStarted","Data":"4ff6a26fe357a7d1c025aee67ef32c91dbbeb7c0184a675944647e92a11886da"} Apr 22 18:03:27.824640 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:27.824604 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-td6db" event={"ID":"3af531fa-df90-4bdf-8278-1b9bca7a4846","Type":"ContainerStarted","Data":"da827986715077411825d517234a80166da40824a1d9c546c949b4d7ba0f1d0b"} Apr 22 18:03:27.825111 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:27.824731 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-td6db" Apr 22 18:03:27.841760 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:27.841721 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-td6db" podStartSLOduration=1.9634120849999999 podStartE2EDuration="4.841710516s" 
podCreationTimestamp="2026-04-22 18:03:23 +0000 UTC" firstStartedPulling="2026-04-22 18:03:24.054784833 +0000 UTC m=+618.742245218" lastFinishedPulling="2026-04-22 18:03:26.933083264 +0000 UTC m=+621.620543649" observedRunningTime="2026-04-22 18:03:27.841317409 +0000 UTC m=+622.528777820" watchObservedRunningTime="2026-04-22 18:03:27.841710516 +0000 UTC m=+622.529170923" Apr 22 18:03:38.829963 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:38.829933 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-td6db" Apr 22 18:03:59.262021 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:59.261990 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c"] Apr 22 18:03:59.267868 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:59.267826 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" Apr 22 18:03:59.270480 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:59.270447 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-8rvmw\"" Apr 22 18:03:59.274145 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:59.274125 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c"] Apr 22 18:03:59.346242 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:59.346222 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d7920cb-4208-424f-9c42-9037b4b7ddae-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c\" (UID: \"4d7920cb-4208-424f-9c42-9037b4b7ddae\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" Apr 22 
18:03:59.447289 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:59.447258 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d7920cb-4208-424f-9c42-9037b4b7ddae-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c\" (UID: \"4d7920cb-4208-424f-9c42-9037b4b7ddae\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" Apr 22 18:03:59.447623 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:59.447602 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d7920cb-4208-424f-9c42-9037b4b7ddae-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c\" (UID: \"4d7920cb-4208-424f-9c42-9037b4b7ddae\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" Apr 22 18:03:59.579811 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:59.579757 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" Apr 22 18:03:59.906106 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:59.906082 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c"] Apr 22 18:03:59.908068 ip-10-0-132-24 kubenswrapper[2574]: W0422 18:03:59.908035 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d7920cb_4208_424f_9c42_9037b4b7ddae.slice/crio-57f0ad39b0c2463fcae2b239ccbe5d214f7ff195212b97c4cea7d761e96ce28d WatchSource:0}: Error finding container 57f0ad39b0c2463fcae2b239ccbe5d214f7ff195212b97c4cea7d761e96ce28d: Status 404 returned error can't find the container with id 57f0ad39b0c2463fcae2b239ccbe5d214f7ff195212b97c4cea7d761e96ce28d Apr 22 18:03:59.935004 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:03:59.934971 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" event={"ID":"4d7920cb-4208-424f-9c42-9037b4b7ddae","Type":"ContainerStarted","Data":"57f0ad39b0c2463fcae2b239ccbe5d214f7ff195212b97c4cea7d761e96ce28d"} Apr 22 18:04:04.954017 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:04:04.953983 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" event={"ID":"4d7920cb-4208-424f-9c42-9037b4b7ddae","Type":"ContainerStarted","Data":"7722c9da27e354d3768670874c0207674e9470c78b1f7852257c68bc16cbad09"} Apr 22 18:04:08.967321 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:04:08.967286 2574 generic.go:358] "Generic (PLEG): container finished" podID="4d7920cb-4208-424f-9c42-9037b4b7ddae" containerID="7722c9da27e354d3768670874c0207674e9470c78b1f7852257c68bc16cbad09" exitCode=0 Apr 22 18:04:08.967675 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:04:08.967341 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" event={"ID":"4d7920cb-4208-424f-9c42-9037b4b7ddae","Type":"ContainerDied","Data":"7722c9da27e354d3768670874c0207674e9470c78b1f7852257c68bc16cbad09"} Apr 22 18:04:22.018650 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:04:22.018560 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" event={"ID":"4d7920cb-4208-424f-9c42-9037b4b7ddae","Type":"ContainerStarted","Data":"34ee2de2e2d1fd4b4531519f0a29dfe8dba6e0b7ec49a7cb220550d76eaa735f"} Apr 22 18:04:25.031379 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:04:25.031347 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" event={"ID":"4d7920cb-4208-424f-9c42-9037b4b7ddae","Type":"ContainerStarted","Data":"22db973d0b8c51317b574f0fd6038b1e7d129ee7d4fd0df0dfedf61c898ecdb1"} Apr 22 18:04:25.031759 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:04:25.031599 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" Apr 22 18:04:25.033190 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:04:25.033146 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" podUID="4d7920cb-4208-424f-9c42-9037b4b7ddae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 18:04:25.053004 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:04:25.052962 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" podStartSLOduration=1.476957188 podStartE2EDuration="26.052951504s" podCreationTimestamp="2026-04-22 18:03:59 
+0000 UTC" firstStartedPulling="2026-04-22 18:03:59.909776188 +0000 UTC m=+654.597236572" lastFinishedPulling="2026-04-22 18:04:24.48577049 +0000 UTC m=+679.173230888" observedRunningTime="2026-04-22 18:04:25.050798946 +0000 UTC m=+679.738259347" watchObservedRunningTime="2026-04-22 18:04:25.052951504 +0000 UTC m=+679.740411910" Apr 22 18:04:26.034760 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:04:26.034727 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" Apr 22 18:04:26.035170 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:04:26.034893 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" podUID="4d7920cb-4208-424f-9c42-9037b4b7ddae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 18:04:26.035669 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:04:26.035645 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" podUID="4d7920cb-4208-424f-9c42-9037b4b7ddae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:04:27.037757 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:04:27.037710 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" podUID="4d7920cb-4208-424f-9c42-9037b4b7ddae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 18:04:27.038138 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:04:27.038041 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" podUID="4d7920cb-4208-424f-9c42-9037b4b7ddae" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:04:37.038711 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:04:37.038645 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" podUID="4d7920cb-4208-424f-9c42-9037b4b7ddae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 18:04:37.039221 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:04:37.039069 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" podUID="4d7920cb-4208-424f-9c42-9037b4b7ddae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:04:47.037897 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:04:47.037782 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" podUID="4d7920cb-4208-424f-9c42-9037b4b7ddae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 18:04:47.038292 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:04:47.038263 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" podUID="4d7920cb-4208-424f-9c42-9037b4b7ddae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:04:57.037969 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:04:57.037914 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" podUID="4d7920cb-4208-424f-9c42-9037b4b7ddae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 18:04:57.038449 ip-10-0-132-24 
kubenswrapper[2574]: I0422 18:04:57.038304 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" podUID="4d7920cb-4208-424f-9c42-9037b4b7ddae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:05:07.038383 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:05:07.038337 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" podUID="4d7920cb-4208-424f-9c42-9037b4b7ddae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 18:05:07.038880 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:05:07.038858 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" podUID="4d7920cb-4208-424f-9c42-9037b4b7ddae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:05:17.037892 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:05:17.037809 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" podUID="4d7920cb-4208-424f-9c42-9037b4b7ddae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 18:05:17.038342 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:05:17.038320 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" podUID="4d7920cb-4208-424f-9c42-9037b4b7ddae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:05:27.038933 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:05:27.038902 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" Apr 22 18:05:27.039308 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:05:27.039013 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c" Apr 22 18:08:05.811297 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:08:05.811269 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:08:05.811934 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:08:05.811917 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:13:05.839432 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:13:05.839394 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:13:05.841940 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:13:05.841522 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:18:05.870258 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:18:05.870141 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:18:05.874150 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:18:05.873313 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:23:05.893634 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:23:05.893516 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:23:05.900334 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:23:05.897158 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:28:05.915897 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:28:05.915762 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:28:05.920165 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:28:05.920147 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:33:05.944287 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:33:05.944182 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:33:05.948205 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:33:05.948124 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:38:05.967802 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:38:05.967681 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:38:05.972654 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:38:05.972633 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:43:05.989826 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:43:05.989798 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:43:06.001685 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:43:06.001662 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:48:06.018324 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:48:06.018200 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:48:06.026032 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:48:06.026008 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:53:06.040884 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:53:06.040754 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:53:06.049394 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:53:06.049377 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:58:06.062220 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:58:06.062111 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 18:58:06.071737 ip-10-0-132-24 kubenswrapper[2574]: I0422 18:58:06.071720 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 19:03:06.084501 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:03:06.084396 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 19:03:06.096644 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:03:06.096622 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 19:04:02.267292 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:04:02.267262 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-644fd69db4-fmdbg_55a4b859-0707-4990-9340-093b91f0a22e/manager/0.log" Apr 22 19:07:02.786441 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:07:02.786409 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-644fd69db4-fmdbg_55a4b859-0707-4990-9340-093b91f0a22e/manager/0.log" Apr 22 19:08:06.106967 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:08:06.106864 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 19:08:06.119353 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:08:06.119333 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 19:13:06.130103 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:13:06.129994 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 19:13:06.143320 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:13:06.143294 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 19:18:06.153738 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:18:06.153619 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 19:18:06.167606 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:18:06.167580 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 19:23:06.176199 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:23:06.176096 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 19:23:06.191576 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:23:06.191552 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log" Apr 22 19:27:19.113005 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:19.112923 2574 ???:1] "http: TLS handshake error from 10.0.135.143:57384: EOF" Apr 22 19:27:19.113517 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:19.113254 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:19.140826 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:19.140803 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:19.152967 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:19.152946 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:19.642568 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:19.642540 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:19.655685 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:19.655650 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:19.668377 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:19.668355 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:20.164383 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:20.164345 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:20.178246 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:20.178224 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:20.190167 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:20.190150 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:20.650233 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:20.650204 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:20.661253 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:20.661231 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:20.672734 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:20.672716 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:21.143889 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:21.143864 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:21.154352 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:21.154330 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:21.164802 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:21.164770 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:21.619137 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:21.619098 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:21.633190 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:21.633160 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:21.645525 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:21.645499 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:22.105321 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:22.105291 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:22.117543 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:22.117515 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:22.128673 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:22.128646 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:22.584542 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:22.584515 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:22.595980 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:22.595959 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:22.606873 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:22.606856 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:23.066232 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:23.066204 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:23.077827 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:23.077804 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:23.088247 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:23.088229 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:23.551417 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:23.551390 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:23.562606 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:23.562588 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:23.574239 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:23.574215 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:24.025428 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:24.025405 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:24.035211 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:24.035186 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:24.046652 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:24.046631 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:24.542055 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:24.542026 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:24.553276 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:24.553256 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:24.564187 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:24.564170 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:25.067410 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:25.067385 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:25.081715 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:25.081694 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:25.094450 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:25.094430 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:25.549640 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:25.549605 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:25.560643 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:25.560618 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:25.571142 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:25.571120 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:26.046310 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:26.046254 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:26.057753 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:26.057712 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:26.068878 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:26.068860 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:26.576790 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:26.576765 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:26.589016 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:26.588992 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:26.600216 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:26.600195 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:27.097804 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:27.097769 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:27.109916 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:27.109875 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:27.121478 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:27.121460 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:27.610881 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:27.610845 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:27.621115 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:27.621087 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:27.631632 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:27.631614 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:28.095502 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:28.095464 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:28.106751 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:28.106730 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:28.118464 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:28.118448 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:28.575724 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:28.575692 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:28.587648 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:28.587626 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:28.599039 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:28.599023 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:29.080131 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:29.080109 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/kserve-container/0.log" Apr 22 19:27:29.094338 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:29.094285 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/agent/0.log" Apr 22 19:27:29.105291 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:29.105255 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-af6af-predictor-6bddc89fbc-slm4c_4d7920cb-4208-424f-9c42-9037b4b7ddae/storage-initializer/0.log" Apr 22 19:27:33.618792 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:33.618762 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-jpw95_9926e4c9-979f-42d2-a480-34e0d7b96299/global-pull-secret-syncer/0.log" Apr 22 19:27:33.767189 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:33.767140 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-lmpbm_0f0aa83e-c1fb-48e4-b074-4915c38e5138/konnectivity-agent/0.log" Apr 22 19:27:33.855784 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:33.855758 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-24.ec2.internal_017fe086330ddbb9d8db4ca21ed5605a/haproxy/0.log" Apr 22 19:27:37.288571 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:37.288543 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-qqshn_a4cdefe3-79ae-40ff-95ad-5f7ed643723e/kube-state-metrics/0.log" Apr 22 19:27:37.308913 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:37.308883 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-qqshn_a4cdefe3-79ae-40ff-95ad-5f7ed643723e/kube-rbac-proxy-main/0.log" Apr 22 19:27:37.331472 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:37.331433 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-qqshn_a4cdefe3-79ae-40ff-95ad-5f7ed643723e/kube-rbac-proxy-self/0.log" Apr 22 19:27:37.362384 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:37.362364 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7d7d964cb8-z9699_114c0cb9-c1db-4db0-b0fc-5585476a4502/metrics-server/0.log" Apr 22 19:27:37.421763 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:37.421742 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-l9d29_7c89d511-a6e3-4823-b018-ec96f670c05c/node-exporter/0.log" Apr 22 19:27:37.444364 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:37.444346 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-l9d29_7c89d511-a6e3-4823-b018-ec96f670c05c/kube-rbac-proxy/0.log" Apr 22 19:27:37.466556 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:37.466537 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-l9d29_7c89d511-a6e3-4823-b018-ec96f670c05c/init-textfile/0.log" Apr 22 19:27:37.899318 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:37.899269 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-4vggv_a7e4f65e-1162-4107-9677-62f7613ed0f5/prometheus-operator/0.log" Apr 22 19:27:37.930519 ip-10-0-132-24 kubenswrapper[2574]: I0422 
19:27:37.930500 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-4vggv_a7e4f65e-1162-4107-9677-62f7613ed0f5/kube-rbac-proxy/0.log" Apr 22 19:27:39.293588 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:39.293560 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-wx69w_2f935fe0-30e5-4e30-8ea9-ea209b4859e3/networking-console-plugin/0.log" Apr 22 19:27:40.059275 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.059249 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5bcc9444db-nthkt_5a7d2a25-558b-4082-b906-7eab8f6162e9/console/0.log" Apr 22 19:27:40.095589 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.095563 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-vsfb6_2c1e09a8-f37d-488d-875f-501a78bdc7b1/download-server/0.log" Apr 22 19:27:40.443528 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.443454 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-6lkst_7e087626-eed9-4af5-a0e7-65ed53ddb4a0/volume-data-source-validator/0.log" Apr 22 19:27:40.599261 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.599232 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d"] Apr 22 19:27:40.602597 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.602581 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d" Apr 22 19:27:40.605092 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.605070 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xtxx5\"/\"openshift-service-ca.crt\"" Apr 22 19:27:40.605225 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.605138 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xtxx5\"/\"kube-root-ca.crt\"" Apr 22 19:27:40.606582 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.606562 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xtxx5\"/\"default-dockercfg-jqt9f\"" Apr 22 19:27:40.608373 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.608348 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d"] Apr 22 19:27:40.636149 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.636128 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3cfee4de-f3d0-422f-94d8-4fd0a201545d-podres\") pod \"perf-node-gather-daemonset-v9d7d\" (UID: \"3cfee4de-f3d0-422f-94d8-4fd0a201545d\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d" Apr 22 19:27:40.636250 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.636162 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3cfee4de-f3d0-422f-94d8-4fd0a201545d-sys\") pod \"perf-node-gather-daemonset-v9d7d\" (UID: \"3cfee4de-f3d0-422f-94d8-4fd0a201545d\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d" Apr 22 19:27:40.636250 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.636185 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3cfee4de-f3d0-422f-94d8-4fd0a201545d-lib-modules\") pod \"perf-node-gather-daemonset-v9d7d\" (UID: \"3cfee4de-f3d0-422f-94d8-4fd0a201545d\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d" Apr 22 19:27:40.636359 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.636259 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jnkw\" (UniqueName: \"kubernetes.io/projected/3cfee4de-f3d0-422f-94d8-4fd0a201545d-kube-api-access-9jnkw\") pod \"perf-node-gather-daemonset-v9d7d\" (UID: \"3cfee4de-f3d0-422f-94d8-4fd0a201545d\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d" Apr 22 19:27:40.636359 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.636329 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3cfee4de-f3d0-422f-94d8-4fd0a201545d-proc\") pod \"perf-node-gather-daemonset-v9d7d\" (UID: \"3cfee4de-f3d0-422f-94d8-4fd0a201545d\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d" Apr 22 19:27:40.737490 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.737468 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jnkw\" (UniqueName: \"kubernetes.io/projected/3cfee4de-f3d0-422f-94d8-4fd0a201545d-kube-api-access-9jnkw\") pod \"perf-node-gather-daemonset-v9d7d\" (UID: \"3cfee4de-f3d0-422f-94d8-4fd0a201545d\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d" Apr 22 19:27:40.737623 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.737498 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3cfee4de-f3d0-422f-94d8-4fd0a201545d-proc\") pod \"perf-node-gather-daemonset-v9d7d\" (UID: \"3cfee4de-f3d0-422f-94d8-4fd0a201545d\") " 
pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d" Apr 22 19:27:40.737623 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.737537 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3cfee4de-f3d0-422f-94d8-4fd0a201545d-podres\") pod \"perf-node-gather-daemonset-v9d7d\" (UID: \"3cfee4de-f3d0-422f-94d8-4fd0a201545d\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d" Apr 22 19:27:40.737623 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.737565 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3cfee4de-f3d0-422f-94d8-4fd0a201545d-sys\") pod \"perf-node-gather-daemonset-v9d7d\" (UID: \"3cfee4de-f3d0-422f-94d8-4fd0a201545d\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d" Apr 22 19:27:40.737623 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.737582 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3cfee4de-f3d0-422f-94d8-4fd0a201545d-lib-modules\") pod \"perf-node-gather-daemonset-v9d7d\" (UID: \"3cfee4de-f3d0-422f-94d8-4fd0a201545d\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d" Apr 22 19:27:40.737623 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.737606 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3cfee4de-f3d0-422f-94d8-4fd0a201545d-proc\") pod \"perf-node-gather-daemonset-v9d7d\" (UID: \"3cfee4de-f3d0-422f-94d8-4fd0a201545d\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d" Apr 22 19:27:40.737810 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.737674 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3cfee4de-f3d0-422f-94d8-4fd0a201545d-sys\") pod 
\"perf-node-gather-daemonset-v9d7d\" (UID: \"3cfee4de-f3d0-422f-94d8-4fd0a201545d\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d" Apr 22 19:27:40.737810 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.737726 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3cfee4de-f3d0-422f-94d8-4fd0a201545d-podres\") pod \"perf-node-gather-daemonset-v9d7d\" (UID: \"3cfee4de-f3d0-422f-94d8-4fd0a201545d\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d" Apr 22 19:27:40.737810 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.737734 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3cfee4de-f3d0-422f-94d8-4fd0a201545d-lib-modules\") pod \"perf-node-gather-daemonset-v9d7d\" (UID: \"3cfee4de-f3d0-422f-94d8-4fd0a201545d\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d" Apr 22 19:27:40.746204 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.746186 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jnkw\" (UniqueName: \"kubernetes.io/projected/3cfee4de-f3d0-422f-94d8-4fd0a201545d-kube-api-access-9jnkw\") pod \"perf-node-gather-daemonset-v9d7d\" (UID: \"3cfee4de-f3d0-422f-94d8-4fd0a201545d\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d" Apr 22 19:27:40.913377 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:40.913353 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d"
Apr 22 19:27:41.030376 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:41.030358 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d"]
Apr 22 19:27:41.032672 ip-10-0-132-24 kubenswrapper[2574]: W0422 19:27:41.032636 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3cfee4de_f3d0_422f_94d8_4fd0a201545d.slice/crio-bde765608de35f7641110823a4aff55b6c24b9cc37924d36b9ace48ea28407c5 WatchSource:0}: Error finding container bde765608de35f7641110823a4aff55b6c24b9cc37924d36b9ace48ea28407c5: Status 404 returned error can't find the container with id bde765608de35f7641110823a4aff55b6c24b9cc37924d36b9ace48ea28407c5
Apr 22 19:27:41.034269 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:41.034253 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:27:41.062791 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:41.062767 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5z57j_18c1f8e2-48fe-42d1-bfbc-436a196841e4/dns/0.log"
Apr 22 19:27:41.084472 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:41.084446 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5z57j_18c1f8e2-48fe-42d1-bfbc-436a196841e4/kube-rbac-proxy/0.log"
Apr 22 19:27:41.227577 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:41.227550 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hbjmn_95190e1c-03c4-4dcf-b739-7c181cb38f82/dns-node-resolver/0.log"
Apr 22 19:27:41.607049 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:41.607021 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-8688656c7d-nlp4j_1d5e5c18-bc99-45de-811b-fd57940d36f8/registry/0.log"
Apr 22 19:27:41.651875 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:41.651851 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nzqmg_a684f094-f2e9-4f18-b33b-e466f94313d8/node-ca/0.log"
Apr 22 19:27:41.996354 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:41.996327 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d" event={"ID":"3cfee4de-f3d0-422f-94d8-4fd0a201545d","Type":"ContainerStarted","Data":"657aac00b12fe2f1f61666e2d27678e35d5f873f39b464cae64b1abd90a668a0"}
Apr 22 19:27:41.996354 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:41.996357 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d" event={"ID":"3cfee4de-f3d0-422f-94d8-4fd0a201545d","Type":"ContainerStarted","Data":"bde765608de35f7641110823a4aff55b6c24b9cc37924d36b9ace48ea28407c5"}
Apr 22 19:27:41.996566 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:41.996423 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d"
Apr 22 19:27:42.012208 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:42.012168 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d" podStartSLOduration=2.012157539 podStartE2EDuration="2.012157539s" podCreationTimestamp="2026-04-22 19:27:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:27:42.011102714 +0000 UTC m=+5676.698563124" watchObservedRunningTime="2026-04-22 19:27:42.012157539 +0000 UTC m=+5676.699617945"
Apr 22 19:27:42.616130 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:42.616101 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-85lcr_0a435446-c735-4a7b-bbb9-eab6af3f7b77/serve-healthcheck-canary/0.log"
Apr 22 19:27:43.023092 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:43.023062 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6w9nf_0f7c8463-f514-4135-ac39-19258a51ead6/kube-rbac-proxy/0.log"
Apr 22 19:27:43.041943 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:43.041915 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6w9nf_0f7c8463-f514-4135-ac39-19258a51ead6/exporter/0.log"
Apr 22 19:27:43.061176 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:43.061158 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6w9nf_0f7c8463-f514-4135-ac39-19258a51ead6/extractor/0.log"
Apr 22 19:27:45.019582 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:45.019549 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-644fd69db4-fmdbg_55a4b859-0707-4990-9340-093b91f0a22e/manager/0.log"
Apr 22 19:27:45.130896 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:45.130863 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-td6db_3af531fa-df90-4bdf-8278-1b9bca7a4846/manager/0.log"
Apr 22 19:27:45.186754 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:45.186725 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-2pw8c_519d0279-49b6-4228-ba0b-94e6b4bc62bb/seaweedfs/0.log"
Apr 22 19:27:48.009446 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:48.009420 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-v9d7d"
Apr 22 19:27:49.161739 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:49.161709 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-82zvn_3c91290f-1a67-4f2b-bb75-f6e0647e34d5/kube-storage-version-migrator-operator/1.log"
Apr 22 19:27:49.162609 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:49.162580 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-82zvn_3c91290f-1a67-4f2b-bb75-f6e0647e34d5/kube-storage-version-migrator-operator/0.log"
Apr 22 19:27:50.217801 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:50.217774 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-972f2_f0fd569b-4e3e-4771-8dec-d6f16a52e2b9/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:27:50.241587 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:50.241572 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-972f2_f0fd569b-4e3e-4771-8dec-d6f16a52e2b9/egress-router-binary-copy/0.log"
Apr 22 19:27:50.259941 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:50.259917 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-972f2_f0fd569b-4e3e-4771-8dec-d6f16a52e2b9/cni-plugins/0.log"
Apr 22 19:27:50.279554 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:50.279537 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-972f2_f0fd569b-4e3e-4771-8dec-d6f16a52e2b9/bond-cni-plugin/0.log"
Apr 22 19:27:50.333512 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:50.333493 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-972f2_f0fd569b-4e3e-4771-8dec-d6f16a52e2b9/routeoverride-cni/0.log"
Apr 22 19:27:50.360300 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:50.360283 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-972f2_f0fd569b-4e3e-4771-8dec-d6f16a52e2b9/whereabouts-cni-bincopy/0.log"
Apr 22 19:27:50.395437 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:50.395415 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-972f2_f0fd569b-4e3e-4771-8dec-d6f16a52e2b9/whereabouts-cni/0.log"
Apr 22 19:27:50.727736 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:50.727700 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cq6p9_64e9c497-2c3a-4764-89fd-29dff8b7c4b1/kube-multus/0.log"
Apr 22 19:27:50.816886 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:50.816861 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fztfm_027c3a56-b141-4f0e-beda-4bbc2fdc45c6/network-metrics-daemon/0.log"
Apr 22 19:27:50.834654 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:50.834630 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fztfm_027c3a56-b141-4f0e-beda-4bbc2fdc45c6/kube-rbac-proxy/0.log"
Apr 22 19:27:52.277888 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:52.277792 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-controller/0.log"
Apr 22 19:27:52.295382 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:52.295355 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/0.log"
Apr 22 19:27:52.319054 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:52.319034 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovn-acl-logging/1.log"
Apr 22 19:27:52.335395 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:52.335372 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/kube-rbac-proxy-node/0.log"
Apr 22 19:27:52.355279 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:52.355258 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 19:27:52.372098 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:52.372076 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/northd/0.log"
Apr 22 19:27:52.390389 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:52.390369 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/nbdb/0.log"
Apr 22 19:27:52.409243 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:52.409228 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/sbdb/0.log"
Apr 22 19:27:52.496087 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:52.496066 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zrmhz_4abe7788-23bd-436c-bc7c-1de96634aa32/ovnkube-controller/0.log"
Apr 22 19:27:53.436770 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:53.436739 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-5t9k8_b3c615fe-9b43-49f7-b16b-8bb8c1710870/check-endpoints/0.log"
Apr 22 19:27:53.457999 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:53.457972 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-4jvnk_eb07fcd6-cc65-437c-9bc0-d210593e3edf/network-check-target-container/0.log"
Apr 22 19:27:54.349674 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:54.349645 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-lql7j_2fa19584-980e-4ca9-a0f2-25e2e3bd0ba6/iptables-alerter/0.log"
Apr 22 19:27:55.029408 ip-10-0-132-24 kubenswrapper[2574]: I0422 19:27:55.029384 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-np549_5b4c353a-3fa1-44c0-954e-74df34b1b224/tuned/0.log"